8.044s13 Sums of Random Variables - MIT OpenCourseWare



4 Sums of Random Variables

Many of the variables dealt with in physics can be expressed as a sum of other variables; often the components of the sum are statistically independent. This section deals with determining the behavior of the sum from the properties of the individual components. First, simple averages are used to find the mean and variance of the sum of statistically independent elements. Next, functions of a random variable are used to examine the probability density of the sum of dependent as well as independent elements. Finally, the Central Limit Theorem is introduced and discussed.

Consider a sum S_n of n statistically independent random variables x_i. The probability densities for the n individual variables need not be identical.

$$S_n \equiv \sum_{i=1}^{n} x_i$$

$$p_{x_i}(\zeta) \text{ does not necessarily equal } p_{x_j}(\zeta) \text{ for } i \neq j$$

$$\langle x_i \rangle \equiv E_i \qquad \langle (x_i - E_i)^2 \rangle \equiv \sigma_i^2$$

The mean value of the sum is the sum of the individual means:

$$\langle S_n \rangle = \int (x_1 + x_2 + \cdots + x_n)\, \underbrace{p(x_1, x_2, \ldots, x_n)}_{p_1(x_1)\, p_2(x_2) \cdots p_n(x_n)}\, dx_1\, dx_2 \cdots dx_n$$

$$= \sum_{i=1}^{n} \Big[ \underbrace{\int x_i\, p_i(x_i)\, dx_i}_{E_i} \Big] \prod_{j \neq i} \Big[ \underbrace{\int p_j(x_j)\, dx_j}_{1} \Big]$$

$$= \sum_{i=1}^{n} E_i$$

The variance of the sum is the sum of the individual variances:

$$\mathrm{Var}(S_n) = \langle (S_n - \langle S_n \rangle)^2 \rangle = \langle (x_1 - E_1 + x_2 - E_2 + \cdots + x_n - E_n)^2 \rangle$$


The right hand side of the above expression is a sum of terms, each quadratic in x, having the following form:

$$\langle (x_i - E_i)(x_j - E_j) \rangle = \langle x_i x_j \rangle - \langle x_i \rangle E_j - \langle x_j \rangle E_i + E_i E_j$$

$$= \langle x_i x_j \rangle - E_i E_j$$

$$= \begin{cases} \langle x_i^2 \rangle - E_i^2 = \sigma_i^2 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$$

Therefore,

$$\mathrm{Var}(S_n) = \sum_{i=1}^{n} \sigma_i^2 .$$

In the special case where all the individual x_i's have the same probability density the above results reduce to

$$\langle S_n \rangle = n E_i , \qquad \mathrm{Var}(S_n) = n \sigma_i^2 .$$

Physically, the width of p(S_n) grows as √n while the mean of S_n grows as n. The probability density becomes more concentrated about the mean as n increases. If n is very large, the distribution develops a sharp narrow peak at the location of the mean.
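These two results are easy to check numerically. The sketch below is not part of the original notes; it assumes NumPy is available and sums n uniform variables, verifying that the mean grows as n and the variance as n (so the width grows as √n).

```python
# Monte Carlo check (illustration only, not from the notes): for a sum of
# n independent uniform variables on [0, 1), E = 1/2 and sigma^2 = 1/12,
# so <S_n> should be n/2 and Var(S_n) should be n/12.
import numpy as np

rng = np.random.default_rng(0)
n = 100
S_n = rng.random((200_000, n)).sum(axis=1)   # 200,000 samples of S_n

assert abs(S_n.mean() - n * 0.5) < 0.1       # mean of the sum = sum of the means
assert abs(S_n.var() - n / 12.0) < 0.2       # variance of the sum = sum of the variances
```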

Now turn to the problem of finding the entire probability density, p_S(α), for the sum of two arbitrary random variables x and y represented by the joint density p_{x,y}(ζ, η). This is a straightforward application of functions of a random variable.


$$P_S(\alpha) = \int_{-\infty}^{\infty} d\zeta \int_{-\infty}^{\alpha - \zeta} d\eta \;\, p_{x,y}(\zeta, \eta)$$

$$p_S(\alpha) = \frac{d}{d\alpha} P_S(\alpha) = \int_{-\infty}^{\infty} d\zeta \;\, p_{x,y}(\zeta, \alpha - \zeta)$$

In this expression p_S(α) is given as a single integral over the joint density p_{x,y}. The result is valid even when x and y are statistically dependent. The result simplifies somewhat when x and y are statistically independent, allowing the joint density to be factored into the product of two individual densities.

$$p_S(\alpha) = \int_{-\infty}^{\infty} d\zeta \;\, p_x(\zeta)\, p_y(\alpha - \zeta) \qquad \text{if } x \text{ and } y \text{ are S.I.}$$

The integral operation involved in the last expression is known as convolution. The probability density for the sum of two S.I. random variables is the convolution of the densities of the two individual variables. Convolution appears in other disciplines as well. The transient output of a linear system (such as an electronic circuit) is the convolution of the impulse response of the system and the input pulse shape. The recorded output of a linear spectrometer (such as a grating spectrograph) is the convolution of the instrumental profile and the intrinsic spectrum being measured.

The convolution of two functions p(x) and q(x) is designated by ⊗ and is defined as

$$p \otimes q \equiv \int_{-\infty}^{\infty} p(z)\, q(x - z)\, dz .$$

It is easy to show from this expression that convolution is commutative, that is, the result does not depend on which function is taken first:

a ⊗ b = b ⊗ a.

Convolution is also distributive,

a ⊗ (b + c) = a ⊗ b + a ⊗ c,


and associative,

a ⊗ (b ⊗ c) = (a ⊗ b) ⊗ c.

Perhaps the best way to visualize the convolution integral is in terms of a series of successive steps.

1. FLIP q(z) ABOUT THE ORIGIN OF ITS ARGUMENT TO FORM THE MIRROR IMAGE q(−z).

2. SHIFT q(−z) TO THE RIGHT BY AN AMOUNT x TO FORM q(−z + x).

3. MULTIPLY q(x − z) BY p(z).

4. INTEGRATE THE PRODUCT AND PLOT THE RESULT.


5. REPEAT 2 THROUGH 4 FOR DIFFERENT VALUES OF x.

If p(z) and q(z) are each represented by a single analytic function for all z one can work out the convolution integral mathematically with little thought given to the above graphic procedure. On the other hand, when the input functions are zero in some regions or are only piecewise continuous, then the graphic procedure is useful in setting the proper limits of integration.
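The flip-shift-multiply-integrate recipe transcribes directly into code. This sketch is my own illustration (the function name and grid are arbitrary); it convolves a discretized density with itself and checks the result against `numpy.convolve`, which performs the same operation.

```python
# Discrete version of steps 1-5 above (illustrative sketch).
import numpy as np

def convolve_densities(p, q, dz):
    """Approximate (p ⊗ q)(x) = ∫ p(z) q(x − z) dz on a uniform grid."""
    out = np.zeros(len(p) + len(q) - 1)
    for k in range(len(out)):              # step 5: repeat for each shift x
        for j in range(len(p)):
            if 0 <= k - j < len(q):        # steps 1-2: flipped, shifted q
                out[k] += p[j] * q[k - j]  # step 3: multiply by p(z)
    return out * dz                        # step 4: integrate (Riemann sum)

dz = 0.01
z = np.arange(0.0, 1.0, dz)
p = np.ones_like(z)                        # uniform density on [0, 1)
r = convolve_densities(p, p, dz)

assert np.allclose(r, np.convolve(p, p) * dz)   # matches the library routine
assert abs(r.sum() * dz - 1.0) < 1e-9           # the result is still normalized
```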

Example: Sum of Two Uniformly Distributed Variables

Given x and y are two statistically independent random variables, uniformly distributed in the regions |x| ≤ a and |y| ≤ b.

Problem Find the probability density p(S) for the sum S = x + y.

Solution The form of the integral will depend on the value of S. Three separate regions need to be considered. Assume as indicated above that b < a.

0 < S < a − b


$$p(S) = \int_{S-b}^{S+b} \left(\frac{1}{2a}\right)\left(\frac{1}{2b}\right) dx = \frac{1}{2a}$$

a − b < S < a + b

$$p(S) = \int_{S-b}^{a} \left(\frac{1}{2a}\right)\left(\frac{1}{2b}\right) dx = \frac{a + b - S}{(2a)(2b)}$$

a + b < S

p(S) = 0 since there is no overlap

The symmetry of the problem demands that the result be even in S. The final result is plotted below.


When the widths of the two probability densities are the same, the densityfor the sum becomes triangular.
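As a quick numerical sanity check (my own addition, with arbitrary values of a and b satisfying b < a), one can sample the two uniform variables and confirm the flat top of height 1/2a for |S| < a − b:

```python
# Monte Carlo check of the flat region of p(S) (illustrative values).
import numpy as np

a, b = 2.0, 0.5                                # assumes b < a, as in the example
rng = np.random.default_rng(1)
N = 500_000
S = rng.uniform(-a, a, N) + rng.uniform(-b, b, N)

# In the flat region |S| < a − b the density should equal 1/(2a).
counts, edges = np.histogram(S, bins=40, range=(-(a - b), a - b))
density = counts / (N * (edges[1] - edges[0]))
assert np.allclose(density, 1.0 / (2 * a), atol=0.02)
```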

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Example: Classical Intensity of Unpolarized Light

Given An unpolarized beam of thermal light can be viewed as two statistically independent beams of equal average intensity, polarized at right angles to each other. The total intensity I_T is the sum of the intensities of each of the two polarized components, I_1 and I_2.

Problem Find the probability density for I_T using the result from a previous example that

$$p(I_1) = \frac{1}{\sqrt{2\pi\alpha I_1}} \exp[-I_1/2\alpha] \qquad I_1 > 0$$

$$= 0 \qquad\qquad I_1 < 0$$

where α = ⟨I_1⟩.

Solution Since the light is thermal and unpolarized, the intensity in each of the two polarization directions has the same density: p_{I_1}(ζ) = p_{I_2}(ζ).


$$p(I_T) = \int_{-\infty}^{\infty} p_{I_1}(I_1)\, p_{I_2}(I_T - I_1)\, dI_1$$

$$= \int_0^{I_T} \left( \frac{\exp[-I_1/2\alpha]}{\sqrt{2\pi\alpha I_1}} \right) \left( \frac{\exp[-(I_T - I_1)/2\alpha]}{\sqrt{2\pi\alpha (I_T - I_1)}} \right) dI_1$$

$$= \frac{1}{2\pi\alpha} \exp[-I_T/2\alpha] \int_0^{I_T} [\,I_1 (I_T - I_1)\,]^{-1/2}\, dI_1$$

$$= \frac{1}{2\pi\alpha} \exp[-I_T/2\alpha] \underbrace{\int_{-I_T/2}^{I_T/2} \left( \frac{I_T^2}{4} - x^2 \right)^{-1/2} dx}_{\pi} \qquad \text{using } x \equiv I_1 - \frac{I_T}{2}$$

$$= \frac{1}{2\alpha} \exp[-I_T/2\alpha] \qquad I_T \geq 0$$

$$= 0 \qquad I_T < 0$$
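A Monte Carlo sketch of this result (not in the original notes): the given p(I_1) is the density of αZ² for a standard normal Z, so the total intensity should come out exponential with mean 2α. The value of α below is arbitrary.

```python
# Each polarized component is modeled as alpha * Z^2, Z standard normal,
# which reproduces p(I1); the sum should be exponential with mean 2*alpha.
import numpy as np

alpha = 1.5                                   # assumed value of <I1>
rng = np.random.default_rng(2)
N = 400_000
I_T = alpha * (rng.standard_normal(N) ** 2 + rng.standard_normal(N) ** 2)

assert abs(I_T.mean() - 2 * alpha) < 0.05        # exponential mean 2*alpha
assert abs(I_T.var() - (2 * alpha) ** 2) < 0.3   # exponential variance (2*alpha)^2
```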


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

In the two examples just considered the variables being summed had probability densities of the same functional form, rectangles for instance. Certainly this will not always be the case; for example one might be interested in the sum of two variables, one having a uniform density and the other having a Gaussian density. Yet even when the input variables do have probability densities of identical form, the density of the sum will in general have a different functional form than that of the input variables. The two previous examples illustrate this point.

There are three special cases, however, where the functional form of the density is preserved during the addition of statistically independent, similarly distributed variables. The sum of two Gaussian variables is Gaussian. This is shown in an example below. Simply knowing that the result is Gaussian, though, is enough to allow one to predict the parameters of the density. Recall that a Gaussian is completely specified by its mean and variance. The fact that the means and variances add when summing S.I. random variables means that the mean of the resultant Gaussian will be the sum of the input means and the variance of the sum will be the sum of the input variances.

The sum of two S.I. Poisson random variables is also Poisson. Here again, knowing that the result is Poisson allows one to determine the parameters in the sum density. Recall that a Poisson density is completely specified by one number, the mean, and the mean of the sum is the sum of the means.


A Lorentzian (or Cauchy) density depends on two parameters and is given by

$$p(x) = \frac{\Gamma}{\pi}\, \frac{1}{(x - m)^2 + \Gamma^2}$$

m = ⟨x⟩

Γ = half width at half height

The sum of two S.I. Lorentzian random variables is Lorentzian with m_S = m_1 + m_2 and Γ_S = Γ_1 + Γ_2. Note that the widths add! Doesn't this contradict the rule about the variances adding, which implies that the widths should grow as the square root of the sum of the squared widths? This apparent contradiction can be resolved by trying to calculate the variance for the Lorentzian density.
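A numerical illustration (my own, with arbitrary half-widths): for Cauchy samples the interquartile range of the sum tracks Γ_1 + Γ_2, even though the sample variance never settles down; that divergent variance is what resolves the apparent contradiction.

```python
# Half-widths of summed Cauchy variables add; the IQR of a Lorentzian of
# half-width G is 2G, so the IQR of the sum should be about 2*(g1 + g2).
import numpy as np

rng = np.random.default_rng(3)
g1, g2 = 1.0, 2.0                          # assumed half-widths
s = g1 * rng.standard_cauchy(200_000) + g2 * rng.standard_cauchy(200_000)

q75, q25 = np.percentile(s, [75, 25])
assert abs((q75 - q25) - 2 * (g1 + g2)) < 0.2
```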

Example: Sum of Two S.I. Gaussian Random Variables

Given

$$p(x) = \frac{1}{\sqrt{2\pi\sigma_x^2}} \exp[-(x - E_x)^2 / 2\sigma_x^2]$$

$$p(y) = \frac{1}{\sqrt{2\pi\sigma_y^2}} \exp[-(y - E_y)^2 / 2\sigma_y^2]$$

Problem Find p(S) where S ≡ x + y.

Solution

$$p(S) = \int_{-\infty}^{\infty} p_x(x)\, p_y(S - x)\, dx$$

$$= (2\pi\sigma_x\sigma_y)^{-1} \int_{-\infty}^{\infty} \exp[-(x - E_x)^2/2\sigma_x^2 - (S - x - E_y)^2/2\sigma_y^2]\, dx$$

The argument of the exponent in the integral is quadratic in the integration variable x. It can be simplified by changing variables, ζ ≡ x − E_x, and defining a temporary quantity a ≡ S − (E_x + E_y), a constant as far as the


integration is concerned. The argument of the exponent can then be written and processed as follows.

$$-\frac{1}{2}\left[ \frac{\zeta^2}{\sigma_x^2} + \frac{(a - \zeta)^2}{\sigma_y^2} \right]
= -\frac{1}{2}\left[ \frac{\sigma_x^2 + \sigma_y^2}{\sigma_x^2 \sigma_y^2}\, \zeta^2 - \frac{2a}{\sigma_y^2}\, \zeta + \frac{a^2}{\sigma_y^2} \right]$$

$$= -\frac{1}{2}\left[ \left\{ \frac{\sigma_x^2 + \sigma_y^2}{\sigma_x^2 \sigma_y^2}\, \zeta^2 - \frac{2a}{\sigma_y^2}\, \zeta + \frac{a^2 \sigma_x^2}{\sigma_y^2 (\sigma_x^2 + \sigma_y^2)} \right\} + \frac{a^2}{\sigma_x^2 + \sigma_y^2} \right]$$

$$= -\frac{1}{2}\left[ \left\{ \left( \frac{\sigma_x^2 \sigma_y^2}{\sigma_x^2 + \sigma_y^2} \right)^{-1} \left( \zeta - \frac{\sigma_x^2\, a}{\sigma_x^2 + \sigma_y^2} \right)^2 \right\} + \frac{a^2}{\sigma_x^2 + \sigma_y^2} \right]$$

The second of the two terms in { } brackets in the last line produces a constant which factors out of the integral. The integral over ζ coming about from the first term in { } brackets is just the normalization integral for a Gaussian and has the value

$$\left( 2\pi \frac{\sigma_x^2 \sigma_y^2}{\sigma_x^2 + \sigma_y^2} \right)^{1/2} .$$

Bringing all of these terms together gives the probability density for the sum.

$$p(S) = (2\pi\sigma_x\sigma_y)^{-1} \exp\left[ -\frac{a^2}{2(\sigma_x^2 + \sigma_y^2)} \right] \left( 2\pi \frac{\sigma_x^2 \sigma_y^2}{\sigma_x^2 + \sigma_y^2} \right)^{1/2}$$

$$= \frac{1}{\sqrt{2\pi(\sigma_x^2 + \sigma_y^2)}} \exp\left[ -\frac{(S - (E_x + E_y))^2}{2(\sigma_x^2 + \sigma_y^2)} \right]$$

The result is a Gaussian with mean E_x + E_y and variance σ_x² + σ_y².
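The algebra can also be cross-checked numerically (a sketch with arbitrary parameters, not part of the notes): convolving the two Gaussian densities on a grid reproduces a Gaussian with the summed mean and summed variance.

```python
# Numerical convolution of two Gaussians vs. the closed-form result.
import numpy as np

Ex, Ey, sx, sy = 1.0, -0.5, 0.8, 1.2       # arbitrary means and widths
dx = 0.01
x = np.arange(-12.0, 12.0, dx)

def gauss(u, E, s):
    return np.exp(-(u - E) ** 2 / (2 * s ** 2)) / np.sqrt(2 * np.pi * s ** 2)

pS = np.convolve(gauss(x, Ex, sx), gauss(x, Ey, sy)) * dx
xS = 2 * x[0] + dx * np.arange(len(pS))    # grid on which the convolution lives
expected = gauss(xS, Ex + Ey, np.sqrt(sx ** 2 + sy ** 2))

assert np.max(np.abs(pS - expected)) < 1e-3
```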

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

The result above applies to the sum of two statistically independent Gaussian variables. It is a remarkable and important fact that the sum of two statistically dependent Gaussian variables is also a Gaussian if the two input variables have a bivariate joint probability density (the special joint density introduced in an earlier example). Those with a masochistic streak and a flair for integration can prove this by applying the expression for the density of the sum of two dependent variables to the bivariate Gaussian joint density.


Mathematical Digression: Convolution and Fourier Transforms

After the algebra of the last example it is reasonable to ask if there is some easier way to compute convolutions. There is, and it involves Fourier transforms. This digression is intended for interested students who are familiar with Fourier techniques.

Let the Fourier transform of f(x) be F(k) and use the notation f ↔ F to indicate a transform pair.

$$F(k) \equiv \int_{-\infty}^{\infty} e^{ikx} f(x)\, dx$$

$$f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-ikx} F(k)\, dk$$

(The Fourier transform of a probability density is called the characteristic function or moment generating function and is quite useful for more advanced topics in probability theory.)

One can show that the Fourier transform of the convolution of two functions is the product of the individual Fourier transforms.

If a ↔ A and b ↔ B, then a ⊗ b ↔ AB.

To find the probability density for the sum of two statistically independent random variables one can multiply the Fourier transforms of the individual probability densities and take the inverse transform of the product. As a practical application and example one can show that the sums of Gaussians are Gaussian, sums of Poisson variables are Poisson, and sums of Lorentzians are Lorentzian.
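In discrete form this is the familiar FFT convolution trick. The sketch below is my own (densities and grid are arbitrary); it multiplies zero-padded FFTs and checks the inverse transform against direct convolution.

```python
# FFT shortcut: transform, multiply, invert == direct convolution.
import numpy as np

dx = 0.01
x = np.arange(0.0, 20.48, dx)              # 2048 grid points
p = np.where(x < 1.0, 1.0, 0.0)            # uniform density on [0, 1)
q = np.exp(-x)                             # one-sided exponential, a = 1

direct = np.convolve(p, q)[: len(x)] * dx
m = 2 * len(x)                             # zero-pad to avoid wrap-around
via_fft = np.fft.irfft(np.fft.rfft(p, m) * np.fft.rfft(q, m), m)[: len(x)] * dx

assert np.allclose(direct, via_fft, atol=1e-10)
```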


Gaussian

$$\frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[-\frac{(x - E)^2}{2\sigma^2}\right] \leftrightarrow \exp\left[ikE - \frac{\sigma^2 k^2}{2}\right]$$

Poisson

$$\sum_{n=0}^{\infty} \frac{1}{n!}\, \lambda^n e^{-\lambda}\, \delta(x - n) \leftrightarrow \exp[\lambda(e^{ik} - 1)]$$

Lorentzian

$$\frac{\Gamma}{\pi}\, \frac{1}{(x - m)^2 + \Gamma^2} \leftrightarrow \exp[imk - \Gamma |k|]$$

Each of these three transforms F(k) has the property that a product of similar functions preserves the functional form; only the parameters change. For example if p_G(S) represents the sum of two Gaussians, then

$$p_G(S) \leftrightarrow \exp\left[ikE_1 - \frac{\sigma_1^2 k^2}{2}\right] \exp\left[ikE_2 - \frac{\sigma_2^2 k^2}{2}\right]$$

$$\leftrightarrow \exp\left[ik(E_1 + E_2) - \frac{(\sigma_1^2 + \sigma_2^2)\, k^2}{2}\right]$$

The last expression is the Fourier transform of a Gaussian of mean E_1 + E_2 and variance σ_1² + σ_2². Check to see that the transforms for the Poisson and Lorentzian behave in a similar manner.

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .


Simple averaging revealed that for statistically independent random variables the mean of the sum is the sum of the means and the variance is the sum of the variances. The Central Limit Theorem gives information about the functional form of the resulting probability density.

Central Limit Theorem (non-mathematical form)

Let S_n be the sum of n statistically independent, identically distributed random variables of mean E_x and variance σ_x² (which must be finite). For large n, p(S_n) can often be well represented by a Gaussian with mean nE_x and variance nσ_x².

$$p(S_n) \approx \frac{1}{\sqrt{2\pi n \sigma_x^2}} \exp\left[ -\frac{(S_n - n E_x)^2}{2 n \sigma_x^2} \right]$$

The vague wording of the theorem as presented here shows immediately why it is qualified by the adjective "non-mathematical". Yet this is perhaps the most useful form of the theorem for our purposes. Consider the specific wording used here.

• σ_x (which must be finite)
This excludes cases such as Lorentzian random variables where the density falls off so slowly with x that the variance is infinite. Of course, for Lorentzians one does not need a Central Limit Theorem for it should be clear from earlier discussion in this section that the sum of n statistically independent Lorentzian random variables will be Lorentzian for any value of n.

• For large n
How large? Mathematicians would be more specific. It will be shown below that for practical purposes convergence with increasing n is often quite rapid.


• can often be well represented by
Certainly not the words of a mathematician! However, the vagueness is deliberate. The Gaussian result presented is a continuous function of the sum variable S_n. But a sum of n discrete random variables will always be discrete, no matter how large n might be. A continuous function is not going to converge in any mathematical sense to a sum of δ functions. What actually happens in such cases is that the envelope of the comb of δ functions approaches a Gaussian shape. A coarse-grained average of p(S_n) (averaging over a number of discrete values at a time) would approach the continuous density presented in the theorem.

So much for the limitations of the Central Limit Theorem; now consider some extensions.

• The Gaussian can be a good practical approximation for modest values of n. The examples which follow illustrate this point.

• The Central Limit Theorem may work even if the individual members of the sum are not identically distributed. A prerequisite is that no individual member should dominate the sum. The necessity for this restriction can be visualized easily. Consider adding 6 variables, each uniformly distributed between −1 and 1, and a 7th variable, uniform between −100 and 100. The density for the sum will look rectangular with rounded edges near ±100.

• The requirement that the variables be statistically independent may even be waived in some cases, particularly when n is very large.

The Central Limit Theorem plays a very important role in science and engineering. Knowingly or unknowingly, it is often the basis for the assumption that some particular physical random variable has a Gaussian probability density.


Example: Sum of Four Uniformly Distributed Variables

Consider a random variable x which is uniformly distributed between −a and a.

The sum of only four such variables has a density which is reasonably close to the CLT approximation. An earlier example found the density for a sum of two of these variables to be triangular. Since convolution is associative, the density of the sum of four can be found by convolving the triangular density with itself. The result is as follows.

$$p(S) = \frac{1}{96 a^4} \left( 32 a^3 - 12 a S^2 + 3 |S|^3 \right) \qquad 0 \leq |S| \leq 2a$$

$$= \frac{1}{96 a^4} \left( 4a - |S| \right)^3 \qquad\qquad 2a \leq |S| \leq 4a$$

$$= 0 \qquad\qquad\qquad\qquad 4a \leq |S|$$

The probability density for the sum is plotted below for a = 1. In this case the mean of the single random variable is zero and its variance is 1/3. The CLT approximation has also been plotted, a Gaussian with zero mean and variance equal to 4/3. Even though the density for a single variable is discontinuous, the density for the sum is quite smooth. The good match to the Gaussian in this example when the number of variables in the sum, n, is as small as 4 is due to the fact that the single density is symmetric and falls off rapidly with increasing argument. But note that the true density for the sum is zero beyond |S| = 4 while the Gaussian is finite. One could say that the percentage error is infinite beyond |S| = 4. Well, you can't have everything. This does illustrate, however, the problems encountered in formulating a mathematical statement of the convergence of the Central Limit Theorem.
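The piecewise result can be checked directly (an illustrative script of mine, using a = 1 as in the plot): the density integrates to one, and at the origin it differs from the CLT Gaussian by only a few percent.

```python
# Evaluate the exact density for the sum of four uniforms and compare
# with the CLT Gaussian (zero mean, variance 4/3) at the origin.
import numpy as np

a = 1.0

def p_sum4(S):
    S = np.abs(np.asarray(S, dtype=float))
    inner = (32 * a**3 - 12 * a * S**2 + 3 * S**3) / (96 * a**4)
    outer = (4 * a - S) ** 3 / (96 * a**4)
    return np.where(S <= 2 * a, inner, np.where(S <= 4 * a, outer, 0.0))

s = np.arange(-4 * a, 4 * a, 0.001)
assert abs(p_sum4(s).sum() * 0.001 - 1.0) < 1e-4      # normalized

clt_at_0 = 1.0 / np.sqrt(2 * np.pi * 4.0 / 3.0)       # Gaussian peak value
assert abs(float(p_sum4(0.0)) - clt_at_0) < 0.015     # 1/3 vs. about 0.3455
```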


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Example: Sums of Exponentially Distributed Variables

A single sided exponential is an example of a density that is both discontinuous and asymmetric. It falls off more gradually with increasing argument than does a rectangular or Gaussian density. In this case convergence to the CLT approximation requires a larger number of elements, n, in the sum.

$$p(x) = \frac{1}{a}\, e^{-x/a} \qquad x > 0$$

$$= 0 \qquad\quad x < 0$$

Here the mean is a and the variance is a². This density is sufficiently simple that the repeated convolution necessary to find the density for the sum of n identical statistically independent variables can be carried out analytically.

$$p(S) = \left[ \frac{S^{n-1}}{(n-1)!\, a^n} \right] e^{-S/a} \qquad S \geq 0$$

$$= 0 \qquad\qquad\quad S < 0$$

This result is shown below for a = 1 and several values of n. The Gaussian approximation is also shown for comparison.
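A Monte Carlo sketch of the n-fold result (my own check; a and n are chosen arbitrarily): a histogram of summed exponentials should follow the density above.

```python
# Sum n one-sided exponentials and compare with S^(n-1) e^(-S/a)/((n-1)! a^n).
import numpy as np
from math import factorial

a, n = 1.0, 8
rng = np.random.default_rng(4)
S = rng.exponential(a, size=(300_000, n)).sum(axis=1)

hist, edges = np.histogram(S, bins=60, range=(0.0, 25.0), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
exact = mids ** (n - 1) * np.exp(-mids / a) / (factorial(n - 1) * a ** n)

assert np.max(np.abs(hist - exact)) < 0.01
```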


[Plots of p(S) for a = 1 and increasing values of n, each compared with the Gaussian approximation (figure omitted).]

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .


Example: Poisson Density for Large < n >

The Central Limit Theorem applied to sums of discrete random variables states that the envelope of the train of δ functions representing p(S) should approach a Gaussian. Here the Poisson density is used to illustrate this behavior. The graphs on the following page show the Poisson density for several values of the mean number of events < n >. (Recall that for a Poisson density, the variance is equal to the mean.) The dots indicate the coefficients of the delta functions that would be located at each positive integer value of S. The dashed lines show the CLT approximation

$$p(S) = \frac{1}{\sqrt{2\pi \langle n \rangle}} \exp\left[ -\frac{(S - \langle n \rangle)^2}{2 \langle n \rangle} \right].$$

Since the sum of Poisson random variables also has a Poisson density, the case for which < n > = 20 could be the result of adding 10 variables of mean 2 or of adding 5 variables of mean 4. For the Poisson, one can see that it is the mean of the final density, not the number of statistically independent elements in the sum, which determines the convergence to the CLT result. The envelope of the density of a single Poisson random variable will approach a Gaussian if the mean is large enough!

The fact that a single Poisson density approaches a Gaussian envelope for large mean is not typical of discrete densities. Consider the Bose-Einstein (or geometric) density that was treated in a homework problem. For that density the probability of obtaining a given integer decreases monotonically with increasing value, no matter how large the mean of the density. In contrast, the density for the sum of 10 identical Bose-Einstein random variables, each with mean 2, would not be of the Bose-Einstein form and, following the CLT, would exhibit a peak in its envelope near S = 20.


[Plots of the Poisson density (dots) and the CLT approximation (dashed) for several values of < n > (figure omitted).]

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .


MIT OpenCourseWare
http://ocw.mit.edu

8.044 Statistical Physics I
Spring 2013

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.