4.1 Random Variables
A random variable is a real-valued function defined on the sample space S.

Example. Independent trials, each of which is a success with probability p, are performed until either a success occurs or a total of n trials is reached. If X denotes the number of trials performed, then

P(X = i) = (1 − p)^{i−1} p,  i = 1, 2, …, n − 1,  and  P(X = n) = (1 − p)^{n−1}.
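As a quick sanity check (a sketch, reading X as the number of trials performed when independent success-probability-p trials stop at the first success or after n trials; `trial_count_pmf` is an illustrative name, not from the text):

```python
def trial_count_pmf(p, n):
    """pmf of X = number of trials performed, stopping at the first
    success or after n trials, whichever comes first."""
    # P(X = i) = (1 - p)^(i - 1) * p for i = 1, ..., n - 1
    pmf = {i: (1 - p) ** (i - 1) * p for i in range(1, n)}
    # P(X = n) = (1 - p)^(n - 1): no success in the first n - 1 trials
    pmf[n] = (1 - p) ** (n - 1)
    return pmf

pmf = trial_count_pmf(p=0.3, n=6)
print(sum(pmf.values()))   # 1, up to floating-point error
```

The last term absorbs all the remaining mass, which is why the probabilities sum to 1.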
Suppose that there are N distinct types of coupons and that each time one obtains a coupon, it is, independently of previous selections, equally likely to be any one of the N types. A random variable of interest is T, the number of coupons that need to be collected until one obtains a complete set containing at least one of each type.
Solution: Fix n and define the events A1, A2, …, AN as follows: Aj is the event that no type-j coupon is contained among the first n coupons collected. Then, by the inclusion–exclusion identity,

P{T > n} = P(⋃_{j=1}^{N} A_j) = ∑_j P(A_j) − ∑_{j1 < j2} P(A_{j1} A_{j2}) + ⋯ + (−1)^{N+1} P(A_1 A_2 ⋯ A_N).

Since each coupon is, independently, equally likely to be any of the N types, the probability that a fixed set of i types is missing from the first n coupons is ((N − i)/N)^n. Hence

P{T > n} = ∑_{i=1}^{N−1} (−1)^{i+1} C(N, i) ((N − i)/N)^n,

where C(N, i) denotes the binomial coefficient, and

P{T = n} = P{T > n − 1} − P{T > n}.
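The inclusion–exclusion formula for P{T > n} can be checked against direct simulation (a sketch; `prob_T_gt` and `simulate_T` are illustrative names):

```python
import math
import random

def prob_T_gt(n, N):
    """P(T > n): not all N coupon types appear among the first n coupons,
    by inclusion-exclusion over which sets of types are missing."""
    return sum((-1) ** (i + 1) * math.comb(N, i) * ((N - i) / N) ** n
               for i in range(1, N))

def simulate_T(N, rng):
    """Draw coupons uniformly at random until all N types have been seen."""
    seen, draws = set(), 0
    while len(seen) < N:
        seen.add(rng.randrange(N))
        draws += 1
    return draws

N, n = 4, 10
rng = random.Random(0)                      # fixed seed for repeatability
trials = 100_000
est = sum(simulate_T(N, rng) > n for _ in range(trials)) / trials
print(round(prob_T_gt(n, N), 4), round(est, 4))   # the two agree closely
```

Note that prob_T_gt(N − 1, N) = 1, as it must: a complete set requires at least N coupons.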
Cumulative Distribution Function
Properties of F(x) [Section 4.9]:

1. F is a nondecreasing function.
2. lim_{b→∞} F(b) = 1.
3. lim_{b→−∞} F(b) = 0.
4. F is right continuous.
5. P{X ≤ b} = F(b), and P{X < b} = F(b−) = lim_{x↗b} F(x).
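Property 5 says that P{X ≤ b} and P{X < b} differ exactly by the jump P{X = b}. A minimal sketch with a fair die (the die is an assumed example, not from the slide), using exact rational arithmetic:

```python
from fractions import Fraction

pmf = {i: Fraction(1, 6) for i in range(1, 7)}   # a fair die

def F(b):          # F(b) = P{X <= b}
    return sum(p for x, p in pmf.items() if x <= b)

def F_minus(b):    # F(b-) = P{X < b}, the left limit of F at b
    return sum(p for x, p in pmf.items() if x < b)

print(F(3), F_minus(3))   # 1/2 1/3: they differ by P{X = 3} = 1/6
```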
Answers: (a) 11/12, (b) 1/6, (c) 3/4, (d) 1/12
4.2-5 Discrete Random Variables
Example. Graph of the pmf of the random variable representing the sum when two dice are rolled
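The pmf graphed on the slide can be reproduced by counting the 36 equally likely outcomes of the two dice (a sketch, using exact fractions):

```python
from collections import Counter
from fractions import Fraction

# pmf of S = sum of two fair dice: each ordered pair (d1, d2) has
# probability 1/36, so count outcomes per sum.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {s: Fraction(c, 36) for s, c in sorted(counts.items())}

for s, p in pmf.items():
    print(s, p)   # rises from P(S = 2) = 1/36 to the peak P(S = 7) = 1/6
```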
Example. Suppose that the probability mass function of X is
Expected Value (or mean)
If X is a discrete random variable having a probability mass function p(x), then the expectation, or the expected value, of X, denoted by E[X], is defined by

µ = E[X] = ∑_{x: p(x) > 0} x p(x).

Example. If X has pmf p(−1) = .10, p(0) = .25, p(1) = .30, p(2) = .35, then

E[X] = (−1)(.10) + 0(.25) + 1(.30) + 2(.35) = .90.
Another motivation for the definition of expectation is provided by the frequency interpretation of probabilities. Think of X as representing our winnings in a single game of chance. That is, with probability p(x_i) we shall win x_i units, i = 1, 2, …, n. By the frequency interpretation, if we play this game continually, then the proportion of time that we win x_i will be p(x_i). Since this is true for all i, i = 1, 2, …, n, it follows that our average winnings per game will be

∑_{i=1}^{n} x_i p(x_i) = E[X].
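The frequency interpretation can be demonstrated directly by simulating many plays of the game from the example above (a sketch; the seed and sample size are arbitrary choices):

```python
import random

# pmf from the example: p(-1) = .10, p(0) = .25, p(1) = .30, p(2) = .35
values, probs = [-1, 0, 1, 2], [0.10, 0.25, 0.30, 0.35]

exact = sum(x * p for x, p in zip(values, probs))      # E[X] = .90

rng = random.Random(0)                                  # fixed seed for repeatability
plays = rng.choices(values, weights=probs, k=200_000)   # 200,000 games
average = sum(plays) / len(plays)
print(exact, average)   # the long-run average winnings approach E[X] = .90
```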
Expectation of a function of a random variable
If X is a random variable, then for any real function g defined on the range of X, Y = g(X) is a random variable.
E[Y] = ∑_j y_j P(Y = y_j).
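The two standard routes to E[g(X)] can be contrasted on a small example (a sketch; the pmf is the one reused from the expectation example earlier, and g(x) = x² is an assumed choice of g):

```python
from collections import defaultdict

pmf_X = {-1: 0.10, 0: 0.25, 1: 0.30, 2: 0.35}
g = lambda x: x ** 2                      # Y = g(X) = X^2

# Solution 1: first derive the pmf of Y, then apply E[Y] = sum y * P(Y = y).
pmf_Y = defaultdict(float)
for x, p in pmf_X.items():
    pmf_Y[g(x)] += p                      # e.g. P(Y = 1) = p(-1) + p(1) = .40
E_Y_1 = sum(y * p for y, p in pmf_Y.items())

# Solution 2: skip the pmf of Y and sum g(x) * p(x) over the pmf of X.
E_Y_2 = sum(g(x) * p for x, p in pmf_X.items())

print(E_Y_1, E_Y_2)                       # both give E[X^2] = 1.8
```

Solution 2 is exactly the content of Proposition 4.1: the pmf of Y never needs to be computed.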
A computation on page 130 of the text shows that, in general,

E[g(X)] = ∑_i g(x_i) p(x_i)  (Proposition 4.1).
Variance

If X is a random variable, its variance is defined by

Var(X) = E[(X − µ)²].

Here µ = E(X). From Proposition 4.1 it follows that for a discrete random variable,

Var(X) = ∑_x (x − µ)² p(x) = E[X²] − (E[X])².

Important property: Var(aX + b) = a² Var(X).
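Both formulas for the variance, and the scaling property, can be checked numerically (a sketch; the pmf is again the one from the expectation example, and a = 3, b = 7 are arbitrary):

```python
pmf = {-1: 0.10, 0: 0.25, 1: 0.30, 2: 0.35}   # same pmf as before

mu = sum(x * p for x, p in pmf.items())                     # E[X] = .90
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())    # E[(X - mu)^2]
E_X2 = sum(x ** 2 * p for x, p in pmf.items())
var_short = E_X2 - mu ** 2                                  # E[X^2] - mu^2
print(var_def, var_short)                 # both 0.99, up to float error

# Important property: Var(aX + b) = a^2 Var(X), e.g. with a = 3, b = 7.
a, b = 3, 7
pmf_Y = {a * x + b: p for x, p in pmf.items()}
mu_Y = sum(y * p for y, p in pmf_Y.items())
var_Y = sum((y - mu_Y) ** 2 * p for y, p in pmf_Y.items())
print(abs(var_Y - a ** 2 * var_def) < 1e-9)   # True
```

Shifting by b changes the mean but not the spread, which is why b drops out of Var(aX + b).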