18.600: Lecture 4
Axioms of probability and inclusion-exclusion
Scott Sheffield
MIT
18.600 Lecture 4
Outline
Axioms of probability
Consequences of axioms
Inclusion-exclusion
Axioms of probability
- P(A) ∈ [0, 1] for all A ⊂ S.
- P(S) = 1.
- Finite additivity: P(A ∪ B) = P(A) + P(B) if A ∩ B = ∅.
- Countable additivity: P(∪∞i=1 Ei) = ∑∞i=1 P(Ei) if Ei ∩ Ej = ∅ for each pair i ≠ j.
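As a concrete sanity check, the axioms can be verified mechanically on a finite sample space. A minimal Python sketch (the fair-die assignment is a hypothetical example, not from the lecture); exact rational arithmetic keeps every comparison exact:

```python
from fractions import Fraction
from itertools import chain, combinations

# Hypothetical example: a fair six-sided die, S = {1, ..., 6}.
S = frozenset({1, 2, 3, 4, 5, 6})
p = {x: Fraction(1, 6) for x in S}  # point mass on each outcome

def P(A):
    """Probability of an event A ⊂ S as the sum of its point masses."""
    return sum(p[x] for x in A)

# Enumerate every event (every subset of S) and check the axioms.
events = [frozenset(c) for c in chain.from_iterable(combinations(S, r) for r in range(7))]
assert all(0 <= P(A) <= 1 for A in events)  # P(A) ∈ [0, 1]
assert P(S) == 1                            # P(S) = 1
for A in events:
    for B in events:
        if not A & B:                       # A ∩ B = ∅
            assert P(A | B) == P(A) + P(B)  # finite additivity
```

Countable additivity cannot be exhausted by enumeration, but on a finite space it reduces to the finite case checked above.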
- Neurological: When I think "it will rain tomorrow," the "truth-sensing" part of my brain exhibits 30 percent of its maximum electrical activity.
- Frequentist: P(A) is the fraction of times A occurred during the previous (large number of) times we ran the experiment.
- Market preference ("risk neutral probability"): P(A) is the price of a contract paying one dollar if A occurs divided by the price of a contract paying one dollar regardless.
- Personal belief: P(A) is the amount such that I'd be indifferent between a contract paying 1 if A occurs and a contract paying P(A) no matter what.
Axiom breakdown
- What if a personal belief function doesn't satisfy the axioms?
- Consider an A-contract (pays 10 dollars if candidate A wins the election), a B-contract (pays 10 dollars if candidate B wins), and an A-or-B contract (pays 10 dollars if either A or B wins).
- Friend: "I'd say the A-contract is worth 1 dollar, the B-contract is worth 1 dollar, and the A-or-B contract is worth 7 dollars."
- Amateur response: "Dude, that is, like, so messed up. Haven't you heard of the axioms of probability?"
- Professional response: "I fully understand and respect your opinions. In fact, let's do some business. You sell me an A-contract and a B-contract for 1.50 each, and I sell you an A-or-B contract for 6.50."
- Friend: "Wow... you've beaten my suggested price by 50 cents on each deal. Yes, sure! You're a great friend!"
- Axiom breakdowns are money-making opportunities.
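The professional's trades can be checked outcome by outcome: because the friend's prices violate additivity, the trades lock in a riskless profit. A minimal Python sketch (prices 1.50 and 6.50 and the 10-dollar payouts are taken from the dialogue):

```python
# Professional's position: long an A-contract and a B-contract
# (bought at 1.50 each), short an A-or-B contract (sold at 6.50).
upfront = 6.50 - 2 * 1.50  # cash the professional pockets immediately

def settlement(outcome):
    """Professional's net payout at settlement for a given election outcome."""
    pays_A = 10 if outcome == "A" else 0
    pays_B = 10 if outcome == "B" else 0
    pays_A_or_B = 10 if outcome in ("A", "B") else 0
    return pays_A + pays_B - pays_A_or_B  # long A, long B, short A-or-B

# Whatever happens, the professional keeps 3.50: a pure arbitrage.
for outcome in ("A", "B", "neither"):
    assert upfront + settlement(outcome) == 3.5
```

The settlement payouts cancel in every outcome, so the upfront 3.50 is the guaranteed profit.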
- Neurological: When I think "it will rain tomorrow," the "truth-sensing" part of my brain exhibits 30 percent of its maximum electrical activity. Should have P(A) ∈ [0, 1], maybe P(S) = 1, but not necessarily P(A ∪ B) = P(A) + P(B) when A ∩ B = ∅.
- Frequentist: P(A) is the fraction of times A occurred during the previous (large number of) times we ran the experiment. Seems to satisfy the axioms...
- Market preference ("risk neutral probability"): P(A) is the price of a contract paying one dollar if A occurs divided by the price of a contract paying one dollar regardless. Seems to satisfy the axioms, assuming no arbitrage, no bid-ask spread, and a complete market...
- Personal belief: P(A) is the amount such that I'd be indifferent between a contract paying 1 if A occurs and a contract paying P(A) no matter what. Seems to satisfy the axioms given some notion of utility units and a strong assumption of "rationality"...
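The frequentist reading can be illustrated by simulation: empirical frequencies automatically lie in [0, 1], assign 1 to the whole sample space, and add over disjoint events (counts of disjoint events add exactly). A sketch with a made-up repeated die experiment:

```python
import random

random.seed(0)
n = 10_000
outcomes = [random.randint(1, 6) for _ in range(n)]  # hypothetical repeated experiment

def freq(A):
    """Fraction of the n trials whose outcome fell in A."""
    return sum(o in A for o in outcomes) / n

S = {1, 2, 3, 4, 5, 6}
A, B = {1, 2}, {5}                         # disjoint events
assert 0 <= freq(A) <= 1                   # freq(A) ∈ [0, 1]
assert freq(S) == 1                        # freq(S) = 1
assert abs(freq(A | B) - (freq(A) + freq(B))) < 1e-12  # additivity
```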
Intersection notation
- We will sometimes write AB to denote the event A ∩ B.
Consequences of axioms
- Can we show from the axioms that P(Aᶜ) = 1 − P(A)?
- Can we show from the axioms that if A ⊂ B then P(A) ≤ P(B)?
- Can we show from the axioms that P(A ∪ B) = P(A) + P(B) − P(AB)?
- Can we show from the axioms that P(AB) ≤ P(A)?
- Can we show from the axioms that if S contains finitely many elements x1, . . . , xk, then the values (P({x1}), P({x2}), . . . , P({xk})) determine the value of P(A) for any A ⊂ S?
- What k-tuples of values are consistent with the axioms?
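Each of these identities can be checked mechanically on a small example. A sketch using exact rational arithmetic (the four-point space and its masses are hypothetical, chosen only so the masses sum to 1):

```python
from fractions import Fraction

# Hypothetical finite sample space with unequal point masses summing to 1.
p = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 8), "d": Fraction(1, 8)}
S = set(p)

def P(A):
    """P(A) as the sum of the point masses of A's elements."""
    return sum(p[x] for x in A)

A, B = {"a", "b"}, {"b", "c"}
assert P(S - A) == 1 - P(A)                # P(Aᶜ) = 1 − P(A)
assert P({"a"}) <= P(A)                    # {"a"} ⊂ A forces P({"a"}) ≤ P(A)
assert P(A | B) == P(A) + P(B) - P(A & B)  # two-event inclusion-exclusion
assert P(A & B) <= P(A)                    # P(AB) ≤ P(A)
```

This checks particular instances, not the general proofs, but it shows how the point masses determine P(A) for every event A ⊂ S.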
Famous 1982 Tversky-Kahneman study (see Wikipedia)
- People are told: "Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations."
- They are asked: Which is more probable?
  - Linda is a bank teller.
  - Linda is a bank teller and is active in the feminist movement.
- 85 percent chose the second option.
- This could be correct under the neurological/emotional definition, or under a "which story would you believe" interpretation (if witnesses offering more details are considered more credible).
- But the axioms of probability imply that the second option cannot be more likely than the first.
Inclusion-exclusion identity
I Imagine we have $n$ events, $E_1, E_2, \ldots, E_n$.
I How do we go about computing something like $P(E_1 \cup E_2 \cup \ldots \cup E_n)$?
I It may be quite difficult, depending on the application.
I There are some situations in which computing $P(E_1 \cup E_2 \cup \ldots \cup E_n)$ is a priori difficult, but it is relatively easy to compute the probability of the intersection of any collection of the $E_i$. That is, we can easily compute quantities like $P(E_1E_3E_7)$ or $P(E_2E_3E_6E_7E_8)$.
I In these situations, the inclusion-exclusion rule helps us compute unions: it gives us a way to express $P(E_1 \cup E_2 \cup \ldots \cup E_n)$ in terms of these intersection probabilities.
I Can we show from the axioms that $P(A \cup B) = P(A) + P(B) - P(AB)$?
I How about $P(E \cup F \cup G) = P(E) + P(F) + P(G) - P(EF) - P(EG) - P(FG) + P(EFG)$?
I More generally,
\[
P\Bigl(\bigcup_{i=1}^{n} E_i\Bigr) = \sum_{i=1}^{n} P(E_i) - \sum_{i_1 < i_2} P(E_{i_1}E_{i_2}) + \ldots + (-1)^{r+1} \sum_{i_1 < i_2 < \ldots < i_r} P(E_{i_1}E_{i_2}\ldots E_{i_r}) + \ldots + (-1)^{n+1} P(E_1E_2\ldots E_n).
\]
I The notation $\sum_{i_1 < i_2 < \ldots < i_r}$ means a sum over all of the $\binom{n}{r}$ subsets of size $r$ of the set $\{1, 2, \ldots, n\}$.
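As a sanity check, the identity can be verified numerically on a small finite sample space with uniform probability. The three events below are arbitrary choices made up for illustration, not from the lecture:

```python
from itertools import combinations
from fractions import Fraction

# Sample space: outcomes 0..29, each with probability 1/30.
S = range(30)
events = [
    {x for x in S if x % 2 == 0},  # E_1: even outcomes
    {x for x in S if x % 3 == 0},  # E_2: multiples of 3
    {x for x in S if x < 10},      # E_3: small outcomes
]

def prob(A):
    return Fraction(len(A), len(S))

# Left side: probability of the union, computed directly.
lhs = prob(set().union(*events))

# Right side: alternating sum over intersections of every size r.
n = len(events)
rhs = Fraction(0)
for r in range(1, n + 1):
    sign = (-1) ** (r + 1)
    for idx in combinations(range(n), r):
        inter = set(S)
        for i in idx:
            inter &= events[i]
        rhs += sign * prob(inter)

assert lhs == rhs  # both equal 23/30 for these events
```

Using `Fraction` keeps the arithmetic exact, so the two sides match identically rather than up to rounding.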
Inclusion-exclusion proof idea
I Consider a region of the Venn diagram contained in exactly $m > 0$ of the subsets. For example, if $m = 3$ and $n = 8$ we could consider the region $E_1E_2E_3^cE_4^cE_5E_6^cE_7^cE_8^c$.
I This region is contained in three single events ($E_1$, $E_2$, and $E_5$), in three double intersections ($E_1E_2$, $E_1E_5$, and $E_2E_5$), and in only one triple intersection ($E_1E_2E_5$).
I It is counted $\binom{m}{1} - \binom{m}{2} + \binom{m}{3} - \ldots \pm \binom{m}{m}$ times in the inclusion-exclusion sum.
I How many is that?
I Answer: 1. (Follows from the binomial expansion of $(1-1)^m$: since $0 = \sum_{k=0}^{m} (-1)^k \binom{m}{k}$, we get $\binom{m}{1} - \binom{m}{2} + \ldots \pm \binom{m}{m} = \binom{m}{0} = 1$.)
I Thus each region in $E_1 \cup \ldots \cup E_n$ is counted exactly once in the inclusion-exclusion sum, which implies the identity.
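The binomial fact behind this step is easy to confirm directly (a small sketch; `alternating_sum` is just an illustrative helper name):

```python
from math import comb

# The alternating sum C(m,1) - C(m,2) + C(m,3) - ... ± C(m,m)
# equals 1 for every m >= 1, because (1-1)^m = sum_k (-1)^k C(m,k) = 0
# and the k = 0 term contributes 1.
def alternating_sum(m):
    return sum((-1) ** (k + 1) * comb(m, k) for k in range(1, m + 1))

assert all(alternating_sum(m) == 1 for m in range(1, 20))
```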
Famous hat problem
I $n$ people toss their hats into a bin; the hats are randomly shuffled, and one hat is returned to each person. Find the probability that nobody gets his or her own hat.
I Inclusion-exclusion. Let $E_i$ be the event that the $i$th person gets his or her own hat.
I What is $P(E_{i_1}E_{i_2}\ldots E_{i_r})$?
I Answer: $\frac{(n-r)!}{n!}$.
I There are $\binom{n}{r}$ terms like that in the inclusion-exclusion sum. What is $\binom{n}{r}\frac{(n-r)!}{n!}$?
I Answer: $\frac{1}{r!}$.
I $P(\cup_{i=1}^{n} E_i) = 1 - \frac{1}{2!} + \frac{1}{3!} - \frac{1}{4!} + \ldots \pm \frac{1}{n!}$
I $1 - P(\cup_{i=1}^{n} E_i) = 1 - 1 + \frac{1}{2!} - \frac{1}{3!} + \frac{1}{4!} - \ldots \pm \frac{1}{n!} \approx 1/e \approx .36788$
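The inclusion-exclusion answer can be checked against brute-force enumeration of permutations for small $n$ (the helper names below are illustrative, not from the lecture):

```python
from math import factorial
from itertools import permutations

# A derangement is a permutation with no fixed point, i.e. nobody
# gets his or her own hat back.
def p_no_fixed_point_brute(n):
    perms = list(permutations(range(n)))
    deranged = sum(all(p[i] != i for i in range(n)) for p in perms)
    return deranged / len(perms)

# Inclusion-exclusion formula: 1 - 1 + 1/2! - 1/3! + ... ± 1/n!.
def p_no_fixed_point_formula(n):
    return sum((-1) ** k / factorial(k) for k in range(n + 1))

# The two agree exactly (up to floating-point error) for every n.
for n in range(1, 8):
    assert abs(p_no_fixed_point_brute(n) - p_no_fixed_point_formula(n)) < 1e-12

# As n grows, the alternating series converges rapidly to 1/e ≈ 0.36788.
```

The factorial in the denominator makes the series converge so fast that even $n = 7$ already matches $1/e$ to four decimal places.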