Add Math Project Work 2016



ADDITIONAL MATHEMATICS PROJECT WORK 2016

PENANG STATE (NORTHEAST DISTRICT) - ANSWER

http://addmathsprojectwork.blogspot.my/

PART 1

a)

INTRODUCTION

Probability is a way of expressing knowledge or belief that an event will occur or has occurred. In mathematics the concept has been given an exact meaning in probability theory, which is used extensively in such areas of study as mathematics, statistics, finance, gambling, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

Probability has a dual aspect: on the one hand the probability or likelihood of hypotheses given the evidence for them, and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older in, for example, the law of evidence, while the mathematical treatment of dice began with the work of Pascal and Fermat in the 1650s. Probability is distinguished from statistics. While statistics deals with data and inferences from it, (stochastic) probability deals with the stochastic (random) processes which lie behind data or outcomes.

HISTORY

Probable and likely and their cognates in other modern languages derive from medieval learned Latin probabilis and verisimilis, deriving from Cicero and generally applied to an opinion to mean plausible or generally approved. Ancient and medieval law of evidence developed a grading of degrees of proof, probabilities, presumptions and half-proof to deal with the uncertainties of evidence in court. In Renaissance times, betting was discussed in terms of odds such as "ten to one" and maritime insurance premiums were estimated based on intuitive risks, but there was no theory on how to calculate such odds or premiums.

The mathematical methods of probability arose in the correspondence of Pierre de Fermat and Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of chance. Christiaan Huygens (1657) gave a comprehensive treatment of the subject. Jacob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's The Doctrine of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large numbers, which states that in a large number of trials, the average of the outcomes is likely to be very close to the expected value - for example, in 1000 throws of a fair coin, it is likely that there are close to 500 heads (and the larger the number of throws, the closer to half-and-half the proportion is likely to be).

The power of probabilistic methods in dealing with uncertainty was shown by Gauss's determination of the orbit of Ceres from a few observations. The theory of errors used the method of least squares to correct error-prone observations, especially in astronomy, based on the assumption of a normal distribution of errors to determine the most likely true value. Towards the end of the nineteenth century, a major success of explanation in terms of probabilities was the statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs, which explained properties of gases such as temperature in terms of the random motions of large numbers of particles.

The field of the history of probability itself was established by Isaac Todhunter's monumental History of the Mathematical Theory of Probability from the Time of Pascal to that of Lagrange (1865). Probability and statistics became closely connected through the work on hypothesis testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and psychological experiments and in clinical trials of drugs. A hypothesis, for example that a drug is usually effective, gives rise to a probability distribution that would be observed if the hypothesis is true. If observations approximately agree with the hypothesis, it is confirmed; if not, the hypothesis is rejected.

The theory of stochastic processes broadened into such areas as Markov processes and Brownian motion, the random movement of tiny particles suspended in a fluid. That provided a model for the study of random fluctuations in stock markets, leading to the use of sophisticated probability models in mathematical finance, including such successes as the widely used Black-Scholes formula for the valuation of options.

The twentieth century also saw long-running disputes on the interpretations of probability. In the mid-century, frequentism was dominant, holding that probability means long-run relative frequency in a large number of trials. At the end of the century there was some revival of the Bayesian view, according to which the fundamental notion of probability is how well a proposition is supported by the evidence for it.

APPLICATIONS

Two major applications of probability theory in everyday life are in risk assessment and in trade on commodity markets. Governments typically apply probabilistic methods in environmental regulation, where it is called "pathway analysis", often measuring well-being using methods that are stochastic in nature, and choosing projects to undertake based on statistical analyses of their probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle East conflict on oil prices, which have ripple effects in the economy as a whole. An assessment by a commodity trader that a war is more likely vs. less likely sends prices up or down, and signals other traders of that opinion. Accordingly, the probabilities are not assessed independently nor necessarily very rationally. The theory of behavioral finance emerged to describe the effect of such groupthink on pricing, on policy, and on peace and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and combine probability assessments has had a profound effect on modern society. Accordingly, it may be of some importance to most citizens to understand how odds and probability assessments are made, and how they contribute to reputations and to decisions, especially in a democracy.


Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, utilize reliability theory in the design of the product in order to reduce the probability of failure. The probability of failure may be closely associated with the product's warranty.

b) Empirical Probability of an event is an "estimate" that the event will happen based on how often the event occurs after collecting data or running an experiment (in a large number of trials). It is based specifically on direct observations or experiences.

Empirical Probability Formula

P(E) = (number of ways the specific event occurs) / (number of ways the experiment could occur)

where P(E) is the probability that an event, E, will occur.

Example: A survey was conducted to determine students' favorite breeds of dogs. Each student chose only one breed.

Dog:  Collie  Spaniel  Lab  Boxer  Pit-bull  Other
#:    10      15       35   8      5         12

What is the probability that a student's favorite dog breed is Lab?

Answer: 35 out of the 85 students chose Lab. The probability is P(Lab) = 35/85 = 7/17.
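As a quick check of this example, a minimal Python sketch (the variable names are illustrative, not part of the project) that computes the empirical probability as the observed count divided by the total number of observations:

from fractions import Fraction

# Survey counts taken from the table above; each student chose exactly one breed
survey = {"Collie": 10, "Spaniel": 15, "Lab": 35, "Boxer": 8, "Pit-bull": 5, "Other": 12}
total = sum(survey.values())              # 85 students in total
p_lab = Fraction(survey["Lab"], total)    # empirical P(Lab) = 35/85
print(p_lab, float(p_lab))                # 7/17, roughly 0.41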

Theoretical Probability of an event is the number of ways that the event can occur, divided by the total number of outcomes. It is finding the probability of events that come from a sample space of known equally likely outcomes.

Theoretical Probability Formula

P(E) = n(E) / n(S)

where P(E) is the probability that an event, E, will occur, n(E) is the number of equally likely outcomes of E, and n(S) is the number of equally likely outcomes of the sample space S.

Example 1: Find the probability of rolling a 6 on a fair die.

Answer: The sample space for rolling a die has 6 equally likely results: {1, 2, 3, 4, 5, 6}. The probability of rolling a 6 is one out of 6, so P(6) = 1/6.

Example 2: Find the probability of tossing a fair die and getting an odd number.

Answer:

event E: tossing an odd number

outcomes in E: {1, 3, 5}

sample space S: {1, 2, 3, 4, 5, 6}

P(E) = n(E) / n(S) = 3/6 = 1/2
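Both examples can be verified with a short Python sketch that counts outcomes in the sample space (the helper name theoretical_p is hypothetical, not from the project):

from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}   # equally likely outcomes of one fair die

def theoretical_p(event):
    """P(E) = n(E) / n(S) for equally likely outcomes."""
    return Fraction(len(event & sample_space), len(sample_space))

print(theoretical_p({6}))           # Example 1: 1/6
print(theoretical_p({1, 3, 5}))     # Example 2: 3/6 = 1/2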


    Comparing Theoretical Probability and Empirical Probability

Karen and Jason roll two dice 50 times and record their results in the accompanying chart.

    1.) What is their empirical probability of rolling a 7?

    2.) What is the theoretical probability of rolling a 7?

    3.) How do the empirical and theoretical probabilities compare?

Sum of the rolls of the two dice:

3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
3, 6, 7, 7, 7, 8, 6, 7, 8, 9

    Solution:

1.) Empirical probability (experimental probability or observed probability) is 13/50 = 26%.

2.) Theoretical probability (based upon what is possible when working with two dice) = 6/36 = 1/6, approximately 16.7%, since 6 of the 36 equally likely outcomes give a sum of 7.

3.) Karen and Jason rolled more 7's than would be expected theoretically.
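As a check on the counts, a small Python sketch that recounts the sevens in Karen and Jason's 50 recorded sums and compares them with the theoretical 6/36:

from fractions import Fraction

rolls = [3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
         8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
         3, 6, 7, 7, 7, 8, 6, 7, 8, 9]

empirical = Fraction(rolls.count(7), len(rolls))   # 13 sevens out of 50 rolls
theoretical = Fraction(6, 36)                      # 6 of the 36 outcomes sum to 7
print(empirical, float(empirical))                 # 13/50 = 0.26
print(theoretical, float(theoretical))             # 1/6, roughly 0.167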


    PART 2

a) There are three players, P1, P2 and P3. The die is a cube with six faces, and the faces show 1, 2, 3, 4, 5 and 6 dots respectively.

Thus, the possible outcomes are {1, 2, 3, 4, 5, 6}.

b) When two dice are tossed simultaneously, the possible outcomes are as shown in the table below.

DIE 1 \ DIE 2:    1      2      3      4      5      6
1               (1,1)  (1,2)  (1,3)  (1,4)  (1,5)  (1,6)
2               (2,1)  (2,2)  (2,3)  (2,4)  (2,5)  (2,6)
3               (3,1)  (3,2)  (3,3)  (3,4)  (3,5)  (3,6)
4               (4,1)  (4,2)  (4,3)  (4,4)  (4,5)  (4,6)
5               (5,1)  (5,2)  (5,3)  (5,4)  (5,5)  (5,6)
6               (6,1)  (6,2)  (6,3)  (6,4)  (6,5)  (6,6)
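The 36 outcomes in the table can also be generated programmatically; a minimal Python sketch:

from itertools import product

# All ordered pairs (die 1, die 2): 6 x 6 = 36 equally likely outcomes
outcomes = list(product(range(1, 7), repeat=2))
print(len(outcomes))    # 36
print(outcomes[:6])     # (1, 1), (1, 2), ..., (1, 6)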

    PART 3

a)

Table 1

Sum of dots on both turned-up faces (x)   Possible outcomes                      Probability, P(x)
2                                         (1,1)                                  1/36
3                                         (1,2),(2,1)                            2/36
4                                         (1,3),(2,2),(3,1)                      3/36
5                                         (1,4),(2,3),(3,2),(4,1)                4/36
6                                         (1,5),(2,4),(3,3),(4,2),(5,1)          5/36
7                                         (1,6),(2,5),(3,4),(4,3),(5,2),(6,1)    6/36
8                                         (2,6),(3,5),(4,4),(5,3),(6,2)          5/36
9                                         (3,6),(4,5),(5,4),(6,3)                4/36
10                                        (4,6),(5,5),(6,4)                      3/36
11                                        (5,6),(6,5)                            2/36
12                                        (6,6)                                  1/36
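Table 1 can be reproduced by grouping the 36 outcomes by their sum; a short Python sketch, assuming the same outcome list as in Part 2(b):

from collections import Counter
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
counts = Counter(a + b for a, b in outcomes)   # number of pairs giving each sum x

for x in range(2, 13):
    print(x, Fraction(counts[x], 36))          # P(x): 1/36, 2/36, ..., 1/6, ..., 1/36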

    b) A = {the two numbers are the same}

    = {(1,1),(2,2),(3,3),(4,4), (5,5),(6,6)}

P(A) = n(A)/n(S) = 6/36 = 1/6

    B = {the product of the two numbers is greater than 25}

    = {(5,5),(5,6),(6,5),(6,6)}

P(B) = n(B)/n(S) = 4/36 = 1/9

C = {both numbers are prime or the difference between the two numbers is even}

= {(2,2),(2,3),(2,5),(3,2),(3,3),(3,5),(5,2),(5,3),(5,5)} ∪ {(1,3),(1,5),(2,4),(2,6),(3,1),(3,5),(4,2),(4,6),(5,1),(5,3),(6,2),(6,4)}

P(C) = n(C)/n(S) = (9 + 12 - 2)/36 = 19/36, since (3,5) and (5,3) appear in both sets and are counted only once.

  • 7/26/2019 Add Math Project Work 2016

    7/12

D = {the sum of the two numbers is odd and both numbers are perfect squares}

Outcomes with an odd sum: {(1,2),(1,4),(1,6),(2,1),(2,3),(2,5),(3,2),(3,4),(3,6),(4,1),(4,3),(4,5),(5,2),(5,4),(5,6),(6,1),(6,3),(6,5)}

Outcomes in which both numbers are perfect squares: {(1,1),(1,4),(4,1),(4,4)}

D is the intersection of these two sets, so D = {(1,4),(4,1)}.

P(D) = n(D)/n(S) = 2/36 = 1/18
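These four probabilities can be verified by enumerating the 36 outcomes. In the sketch below, events A and D are defined by their stated conditions, while B and C are built directly from the outcome sets listed above:

from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))          # the 36 equally likely outcomes

def P(event):
    return Fraction(len(event), len(S))

A = {(a, b) for a, b in S if a == b}             # both numbers the same
B = {(5, 5), (5, 6), (6, 5), (6, 6)}             # the outcomes listed for event B above
primes = {(a, b) for a, b in S if a in {2, 3, 5} and b in {2, 3, 5}}
diff_even = {(1, 3), (1, 5), (2, 4), (2, 6), (3, 1), (3, 5),
             (4, 2), (4, 6), (5, 1), (5, 3), (6, 2), (6, 4)}
C = primes | diff_even                           # union of the two sets listed for event C
D = {(a, b) for a, b in S                        # sum odd and both numbers perfect squares
     if (a + b) % 2 == 1 and a in {1, 4} and b in {1, 4}}

print(P(A), P(B), P(C), P(D))                    # 1/6, 1/9, 19/36, 1/18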

    PART 4

a)

Sum of the two numbers (x)   Frequency (f)   fx    fx²
2                            2               4     8
3                            5               15    45
4                            5               20    80
5                            1               5     25
6                            4               24    144
7                            8               56    392
8                            8               64    512
9                            5               45    405
10                           7               70    700
11                           3               33    363
12                           2               24    288
Total                        50              360   2962

Mean = Σfx / Σf = 360/50 = 7.2

Variance = Σfx²/Σf - (mean)² = 2962/50 - 7.2² = 59.24 - 51.84 = 7.4

Standard deviation = √7.4 = 2.72
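The same grouped-frequency calculation can be written as a short Python sketch (the helper name grouped_stats is illustrative, not part of the project):

import math

def grouped_stats(freq):
    """Mean, variance and standard deviation from a {value: frequency} table."""
    n = sum(freq.values())
    mean = sum(x * f for x, f in freq.items()) / n
    var = sum(x * x * f for x, f in freq.items()) / n - mean ** 2
    return mean, var, math.sqrt(var)

freq_50 = {2: 2, 3: 5, 4: 5, 5: 1, 6: 4, 7: 8, 8: 8, 9: 5, 10: 7, 11: 3, 12: 2}
print(grouped_stats(freq_50))   # approximately (7.2, 7.4, 2.72)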

b) When the number of times the two dice are tossed simultaneously is increased to 100, the value of the mean will also change.

  • 7/26/2019 Add Math Project Work 2016

    8/12

c)

x       f     fx    fx²
2       6     12    24
3       9     27    81
4       11    44    176
5       12    60    300
6       13    78    468
7       10    70    490
8       7     56    448
9       12    108   972
10      7     70    700
11      6     66    726
12      7     84    1008
Total   100   675   5393

Mean = Σfx / Σf = 675/100 = 6.75

Variance = Σfx²/Σf - (mean)² = 5393/100 - 6.75² = 53.93 - 45.5625 = 8.3675

Standard deviation = √8.3675 = 2.8927
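Applying the same calculation to the n=100 frequency table reproduces the values above; a self-contained sketch:

import math

freq_100 = {2: 6, 3: 9, 4: 11, 5: 12, 6: 13, 7: 10, 8: 7, 9: 12, 10: 7, 11: 6, 12: 7}
n = sum(freq_100.values())                                          # 100 tosses
mean = sum(x * f for x, f in freq_100.items()) / n                  # 675/100 = 6.75
var = sum(x * x * f for x, f in freq_100.items()) / n - mean ** 2   # about 8.3675
print(mean, var, math.sqrt(var))                                    # 6.75, 8.3675, about 2.8927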



PART 5

When two dice are tossed simultaneously, the actual (theoretical) mean and variance of the sum of all dots on the turned-up faces can be determined by using the formulae below:

Mean = Σ x P(x)

Variance = Σ x² P(x) - [Σ x P(x)]²

a) Based on Table 1, the actual mean, the variance and the standard deviation of the sum of all dots on the turned-up faces are determined by using the formulae given.

x     x²    P(x)   x P(x)   x² P(x)
2     4     1/36   1/18     1/9
3     9     2/36   1/6      1/2
4     16    3/36   1/3      4/3
5     25    4/36   5/9      25/9
6     36    5/36   5/6      5
7     49    6/36   7/6      49/6
8     64    5/36   10/9     80/9
9     81    4/36   1        9
10    100   3/36   5/6      25/3
11    121   2/36   11/18    121/18
12    144   1/36   1/3      4
Sum         1      7        329/6


Mean = Σ x P(x) = 7

Variance = Σ x² P(x) - (mean)² = 329/6 - 7² = 54.8333 - 49 = 5.8333

Standard deviation = √5.8333 = 2.4152
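The theoretical values can be checked exactly with fractions; a minimal Python sketch based on the distribution in Table 1:

import math
from collections import Counter
from fractions import Fraction
from itertools import product

counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
dist = {x: Fraction(c, 36) for x, c in counts.items()}      # P(x), as in Table 1

mean = sum(x * p for x, p in dist.items())                  # 7
var = sum(x * x * p for x, p in dist.items()) - mean ** 2   # 329/6 - 49 = 35/6
print(mean, var, float(var), math.sqrt(var))                # 7, 35/6, 5.8333..., 2.4152...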

    b)

Table below shows the comparison of the mean, variance and standard deviation of Part 4 and Part 5.

                     PART 4 (n=50)   PART 4 (n=100)   PART 5 (theoretical)
Mean                 7.2             6.75             7.00
Variance             7.4             8.3675           5.8333
Standard deviation   2.72            2.8927           2.4152

We can see that the mean, variance and standard deviation that we obtained through experiment in Part 4 are different from, but close to, the theoretical values in Part 5.

For the mean, when the number of trials increased from n=50 to n=100, its value (7.2 and then 6.75) stayed close to the theoretical value of 7, broadly in line with the Law of Large Numbers.

Nevertheless, the empirical variance and empirical standard deviation that we obtained in Part 4 move further from the theoretical values in Part 5 as n increases. This appears to violate the Law of Large Numbers. This is probably due to:

a. The sample (n=100) is not large enough to see the values of the mean, variance and standard deviation settle towards the theoretical values.

b. The Law of Large Numbers is not an absolute law; deviations of this kind are still possible in a finite number of trials, although their probability is relatively low.

In conclusion, the empirical mean, variance and standard deviation can differ from the theoretical values. As the number of trials (the sample size) gets bigger, the empirical values should get closer to the theoretical values. However, deviations from this rule are still possible, especially when the number of trials (or the sample size) is not large enough.
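As a rough illustration of this point, a simulation sketch (the random seed and trial sizes are arbitrary choices, not part of the project) that repeats the two-dice experiment with a growing number of tosses and prints how the sample mean and variance behave relative to the theoretical values 7 and 5.8333:

import random
import statistics

random.seed(1)   # arbitrary seed, for reproducibility only

for n in (50, 100, 1000, 10000):
    sums = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]
    mean = statistics.fmean(sums)
    var = statistics.pvariance(sums)          # population variance, matching the project's formula
    print(n, round(mean, 4), round(var, 4))   # should settle near 7 and 5.8333 as n grows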



c) The range of the mean: 6 ≤ mean ≤ 7.2

Conjecture: As the number of tosses, n, increases, the mean will get closer to 7, which is the theoretical mean.

The image below supports this conjecture: we can see that after about 500 tosses, the observed mean becomes very close to the theoretical mean, which is 3.5. (Take note that this is an experiment of tossing 1 die, not 2 dice as in our experiment.)

    http://upload.wikimedia.org/wikipedia/commons/f/f9/Largenumbers.svg
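The same running-average behaviour can be checked in code instead of reading it off the image; a sketch for a single die (arbitrary seed), whose running mean should settle near 3.5:

import random

random.seed(2)                       # arbitrary seed, for reproducibility only
total = 0
for i in range(1, 1001):
    total += random.randint(1, 6)    # one toss of a single fair die
    if i in (10, 100, 500, 1000):
        print(i, total / i)          # running mean should approach the theoretical 3.5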

    FURTHER EXPLORATION

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.

For example, a single roll of a six-sided die produces one of the numbers 1, 2, 3, 4, 5, 6, each with equal probability. Therefore, the expected value of a single die roll is

(1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5

According to the law of large numbers, if a large number of dice are rolled, the average of their values (sometimes called the sample mean) is likely to be close to 3.5, with the accuracy increasing as more dice are rolled.

Similarly, when a fair coin is flipped once, the expected value of the number of heads is equal to one half. Therefore, according to the law of large numbers, the proportion of heads in a large number of coin flips should be roughly one half. In particular, the proportion of heads after n flips will almost surely converge to one half as n approaches infinity.

Though the proportion of heads (and tails) approaches one half, almost surely the absolute (nominal) difference in the number of heads and tails will become large as the number of flips becomes large. That is, the probability that the absolute difference is a small number approaches zero as the number of flips becomes large. Also, almost surely the ratio of the absolute difference to the number of flips will approach zero. Intuitively, the expected absolute difference grows, but at a slower rate than the number of flips, as the number of flips grows.
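The two statements about coin flips (the proportion converging to one half while the absolute difference tends to grow) can be illustrated with a short simulation sketch (seed and flip counts are arbitrary):

import random

random.seed(3)   # arbitrary seed, for reproducibility only
for n in (100, 10000, 1000000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    tails = n - heads
    # proportion of heads should be near 0.5, while |heads - tails| typically grows with n
    print(n, heads / n, abs(heads - tails))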

The LLN is important because it "guarantees" stable long-term results for random events. For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins. Any winning streak by a player will eventually be overcome by the parameters of the game. It is important to remember that the LLN only applies (as the name indicates) when a large number of observations are considered. There is no principle that a small number of observations will converge to the expected value or that a streak of one value will immediately be "balanced" by the others. See the Gambler's fallacy.
