KERJA PROJEK MATEMATIK TAMBAHAN 2016


NEGERI P.PINANG (TIMUR LAUT) - ANSWER
http://addmathsprojectwork.blogspot.my/

PART 1
a)

INTRODUCTION
Probability is a way of expressing knowledge or belief that an event will occur or has occurred.
In mathematics the concept has been given an exact meaning in probability theory, which is used
extensively in such areas of study as mathematics, statistics, finance, gambling, science, and
philosophy to draw conclusions about the likelihood of potential events and the underlying
mechanics of complex systems.

Probability has a dual aspect: on the one hand the probability or likelihood of hypotheses given
the evidence for them and on the other hand the behavior of stochastic processes such as the
throwing of dice or coins. The study of the former is historically older, appearing for example in the law of
evidence, while the mathematical treatment of dice began with the work of Pascal and Fermat in
the 1650s. Probability is distinguished from statistics. While statistics deals with data and
inferences from it, (stochastic) probability deals with the stochastic (random) processes which lie
behind data or outcomes.

HISTORY
Probable and likely and their cognates in other modern languages derive from medieval learned
Latin probabilis and verisimilis, deriving from Cicero and generally applied to an opinion to
mean plausible or generally approved.
Ancient and medieval law of evidence developed a grading of degrees of proof, probabilities,
presumptions and half-proof to deal with the uncertainties of evidence in court. In Renaissance
times, betting was discussed in terms of odds such as "ten to one" and maritime insurance
premiums were estimated based on intuitive risks, but there was no theory on how to calculate
such odds or premiums.

The mathematical methods of probability arose in the correspondence of Pierre de Fermat and
Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of
chance. Christiaan Huygens (1657) gave a comprehensive treatment of the subject.
Jacob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's The Doctrine
of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a
wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large
numbers, which states that in a large number of trials, the average of the outcomes is likely to be
very close to the expected value - for example, in 1000 throws of a fair coin, it is likely that there
are close to 500 heads (and the larger the number of throws, the closer to half-and-half the
proportion is likely to be).
The power of probabilistic methods in dealing with uncertainty was shown by Gauss's
determination of the orbit of Ceres from a few observations. The theory of errors used the
method of least squares to correct error-prone observations, especially in astronomy, based on the
assumption of a normal distribution of errors to determine the most likely true value.
Towards the end of the nineteenth century, a major success of explanation in terms of
probabilities was the Statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs which
explained properties of gases such as temperature in terms of the random motions of large
numbers of particles.

The field of the history of probability itself was established by Isaac Todhunter's monumental
History of the Mathematical Theory of Probability from the Time of Pascal to that of Lagrange
(1865). Probability and statistics became closely connected through the work on hypothesis
testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and
psychological experiments and in clinical trials of drugs. A hypothesis, for example that a drug is
usually effective, gives rise to a probability distribution that would be observed if the hypothesis
is true. If observations approximately agree with the hypothesis, it is confirmed; if not, it is
rejected.

The theory of stochastic processes broadened into such areas as Markov processes and Brownian
motion, the random movement of tiny particles suspended in a fluid. That provided a model for
the study of random fluctuations in stock markets, leading to the use of sophisticated probability
models in mathematical finance, including such successes as the widely-used Black-Scholes
formula for the valuation of options.

ec
t

The twentieth century also saw long-running disputes on the interpretations of probability. In the
mid-century frequentism was dominant, holding that probability means long-run relative
frequency in a large number of trials. At the end of the century there was some revival of the
Bayesian view, according to which the fundamental notion of probability is how well a
proposition is supported by the evidence for it.

APPLICATIONS
Two major applications of probability theory in everyday life are in risk assessment and in trade
on commodity markets. Governments typically apply probabilistic methods in environmental
regulation where it is called "pathway analysis", often measuring well-being using methods that
are stochastic in nature, and choosing projects to undertake based on statistical analyses of their
probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle East conflict
on oil prices - which have ripple effects in the economy as a whole. An assessment by a
commodity trader that a war is more likely vs. less likely sends prices up or down, and signals
other traders of that opinion. Accordingly, the probabilities are not assessed independently nor
necessarily very rationally. The theory of behavioral finance emerged to describe the effect of
such groupthink on pricing, on policy, and on peace and conflict.
It can reasonably be said that the discovery of rigorous methods to assess and combine
probability assessments has had a profound effect on modern society. Accordingly, it may be of
some importance to most citizens to understand how odds and probability assessments are made,
and how they contribute to reputations and to decisions, especially in a democracy.

Another significant application of probability theory in everyday life is reliability. Many


consumer products, such as automobiles and consumer electronics, utilize reliability theory in the
design of the product in order to reduce the probability of failure. The probability of failure may
be closely associated with the product's warranty.


b)
Empirical Probability of an event is an "estimate" that the event will happen based on how
often the event occurs after collecting data or running an experiment (in a large number of
trials). It is based specifically on direct observations or experiences.

Example: A survey was conducted to determine students' favorite breeds of dogs. Each
student chose only one breed.

Breed:               Pitbull   Collie   Spaniel   Lab   Boxer   Other
Number of students:  10        15       8         35    5       12

What is the probability that a student's favorite dog breed is Lab?

Answer: 35 out of the 85 students chose Lab. The probability is P(Lab) = 35/85 = 7/17.

Empirical Probability Formula

P(E) = (number of times the event occurs) / (total number of observations)

P(E) = probability that an event, E, will occur.
Top = number of ways the specific event occurs.
Bottom = number of ways the experiment could occur.
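As a quick cross-check, the survey calculation above can be reproduced in a few lines of Python. This is only an illustrative sketch; the dictionary simply restates the counts from the table.

```python
# Empirical probability: (number of times the event occurred) / (total observations).
# The breed counts below restate the survey table above.
survey = {"Pitbull": 10, "Collie": 15, "Spaniel": 8, "Lab": 35, "Boxer": 5, "Other": 12}

total = sum(survey.values())      # 85 students in total
p_lab = survey["Lab"] / total     # 35/85 = 7/17

print(total)            # 85
print(round(p_lab, 4))  # 0.4118
```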

Theoretical Probability of an event is the number of ways that the event can occur, divided by
the total number of outcomes. It is finding the probability of events that come from a sample
space of known equally likely outcomes.

Theoretical Probability Formula

P(E) = n(E) / n(S)

P(E) = probability that an event, E, will occur.
n(E) = number of equally likely outcomes of E.
n(S) = number of equally likely outcomes of sample space S.

Example 1: Find the probability of rolling a six on a fair die.
Answer: The sample space for rolling a die has 6 equally likely results: {1, 2, 3, 4, 5, 6}.
The probability of rolling a 6 is one out of 6, or 1/6.

Example 2: Find the probability of tossing a fair die and getting an odd number.
Answer:
event E: tossing an odd number
outcomes in E: {1, 3, 5}
sample space S: {1, 2, 3, 4, 5, 6}
P(E) = n(E)/n(S) = 3/6 = 1/2
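The formula P(E) = n(E)/n(S) can be sketched directly in code; the small helper below (a hypothetical function, not part of the project) computes both examples with exact fractions.

```python
from fractions import Fraction

# Theoretical probability: P(E) = n(E) / n(S) for equally likely outcomes.
S = {1, 2, 3, 4, 5, 6}  # sample space of one fair die

def p(event):
    """Probability of an event given as a subset of the sample space S."""
    return Fraction(len(event & S), len(S))

print(p({6}))        # 1/6  (Example 1: rolling a six)
print(p({1, 3, 5}))  # 1/2  (Example 2: rolling an odd number)
```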

Comparing Theoretical Probability and Empirical Probability

Karen and Jason roll two dice 50 times and record their results in the accompanying chart
(sum of the rolls of the two dice):

3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
3, 6, 7, 7, 7, 8, 6, 7, 8, 9

1.) What is their empirical probability of rolling a 7?
2.) What is the theoretical probability of rolling a 7?
3.) How do the empirical and theoretical probabilities compare?

Solution:
1.) A 7 appears 13 times in the 50 rolls, so the empirical probability (experimental
probability or observed probability) is 13/50 = 26%.
2.) The theoretical probability, based on what is possible when working with two dice
(6 of the 36 possible outcomes sum to 7), is 6/36 = 1/6 = 16.7%.
3.) Karen and Jason rolled more 7's than would be expected theoretically.
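The counting in the solution above can be verified mechanically; the sketch below copies Karen and Jason's 50 recorded sums and compares the empirical and theoretical probabilities.

```python
from fractions import Fraction

# Sums recorded by Karen and Jason over 50 rolls (copied from the chart above).
rolls = [3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
         8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
         3, 6, 7, 7, 7, 8, 6, 7, 8, 9]

empirical = Fraction(rolls.count(7), len(rolls))  # 13/50
theoretical = Fraction(6, 36)                     # 6 of the 36 outcomes sum to 7

print(empirical, float(empirical))  # 13/50 0.26
print(theoretical)                  # 1/6, roughly 0.167
```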

PART 2

a) There are three players, referred to as P1, P2 and P3. A die is a cube with six faces, and
the numbers of dots on the faces are 1, 2, 3, 4, 5 and 6 respectively.

Thus, the possible outcomes are: {1, 2, 3, 4, 5, 6}

b) When two dice are tossed simultaneously, the possible outcomes are as shown in the table
below (36 outcomes in total):

(1,1)  (1,2)  (1,3)  (1,4)  (1,5)  (1,6)
(2,1)  (2,2)  (2,3)  (2,4)  (2,5)  (2,6)
(3,1)  (3,2)  (3,3)  (3,4)  (3,5)  (3,6)
(4,1)  (4,2)  (4,3)  (4,4)  (4,5)  (4,6)
(5,1)  (5,2)  (5,3)  (5,4)  (5,5)  (5,6)
(6,1)  (6,2)  (6,3)  (6,4)  (6,5)  (6,6)
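The same sample space can be generated programmatically; this is just a sketch using the standard library.

```python
from itertools import product

# All equally likely outcomes when two fair dice are tossed simultaneously.
sample_space = list(product(range(1, 7), repeat=2))

print(len(sample_space))  # 36
print(sample_space[:6])   # [(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6)]
```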
PART 3

a)

Table 1

Sum of dots on both
turned-up faces (x)   Possible outcomes (Die 1, Die 2)        Probability, P(x)
2                     (1,1)                                   1/36
3                     (1,2),(2,1)                             2/36
4                     (1,3),(2,2),(3,1)                       3/36
5                     (1,4),(2,3),(3,2),(4,1)                 4/36
6                     (1,5),(2,4),(3,3),(4,2),(5,1)           5/36
7                     (1,6),(2,5),(3,4),(4,3),(5,2),(6,1)     6/36
8                     (2,6),(3,5),(4,4),(5,3),(6,2)           5/36
9                     (3,6),(4,5),(5,4),(6,3)                 4/36
10                    (4,6),(5,5),(6,4)                       3/36
11                    (5,6),(6,5)                             2/36
12                    (6,6)                                   1/36
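Table 1 can be rebuilt by tallying the sums over the 36 equally likely outcomes; the sketch below does exactly that with exact fractions.

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Tally the sum of the dots over all 36 equally likely outcomes (as in Table 1).
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
table = {x: Fraction(counts[x], 36) for x in range(2, 13)}

for x, px in table.items():
    print(x, px)  # 2 -> 1/36, 3 -> 1/18 (i.e. 2/36), ..., 7 -> 1/6, ..., 12 -> 1/36
```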

b) A = {the two numbers are the same}
     = {(1,1),(2,2),(3,3),(4,4),(5,5),(6,6)}
P(A) = n(A)/n(S) = 6/36 = 1/6

B = {the product of the two numbers is at least 25}
  = {(5,5),(5,6),(6,5),(6,6)}
P(B) = n(B)/n(S) = 4/36 = 1/9

C = {both numbers are prime or the difference between the two numbers is even}
Both numbers prime: {(2,2),(2,3),(2,5),(3,2),(3,3),(3,5),(5,2),(5,3),(5,5)}, 9 outcomes.
Difference even (both numbers odd or both even): 18 outcomes.
Both conditions at once: {(2,2),(3,3),(5,5),(3,5),(5,3)}, 5 outcomes.
By inclusion-exclusion,
P(C) = 9/36 + 18/36 - 5/36 = 22/36 = 11/18

D = {the sum of the two numbers is odd and both numbers are perfect squares}
The perfect-square faces are 1 and 4, so the outcomes with both numbers square are
{(1,1),(1,4),(4,1),(4,4)}; of these, only (1,4) and (4,1) have an odd sum.
D = {(1,4),(4,1)}
P(D) = 2/36 = 1/18
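The four event probabilities can be checked by brute force over the 36 outcomes. The sketch below encodes each event as a predicate (the helper `p` is ours, not part of the project) and confirms the values above.

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes
primes = {2, 3, 5}
squares = {1, 4}                          # perfect-square faces of a die

def p(pred):
    """Probability that a random outcome satisfies the predicate."""
    return Fraction(sum(1 for o in S if pred(o)), len(S))

p_A = p(lambda o: o[0] == o[1])                              # same numbers
p_B = p(lambda o: o[0] * o[1] >= 25)                         # product at least 25
p_C = p(lambda o: (o[0] in primes and o[1] in primes)
                  or (o[0] - o[1]) % 2 == 0)                 # prime pair or even difference
p_D = p(lambda o: (o[0] + o[1]) % 2 == 1
                  and o[0] in squares and o[1] in squares)   # odd sum, both squares

print(p_A, p_B, p_C, p_D)  # 1/6 1/9 11/18 1/18
```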

PART 4

a)

The two dice are tossed simultaneously 50 times and the sum of the two numbers (x) is
recorded each time. The results are summarised in the table below.

Sum of the two
numbers (x)   Frequency (f)   fx    fx^2
2             2               4     8
3             5               15    45
4             5               20    80
5             1               5     25
6             4               24    144
7             8               56    392
8             8               64    512
9             5               45    405
10            7               70    700
11            3               33    363
12            2               24    288
Total         50              360   2962

Mean = 360/50 = 7.2
Variance = 2962/50 - 7.2^2 = 7.4
Standard deviation = √7.4 = 2.72
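The grouped-data formulas used above can be sketched in code; the dictionary restates the frequency table and the computation mirrors Mean = Σfx/Σf and Variance = Σfx²/Σf − (mean)².

```python
import math

# Frequency table from the 50 tosses above: {sum x: frequency f}.
freq = {2: 2, 3: 5, 4: 5, 5: 1, 6: 4, 7: 8, 8: 8, 9: 5, 10: 7, 11: 3, 12: 2}

n = sum(freq.values())                                         # 50
mean = sum(x * f for x, f in freq.items()) / n                 # 360/50 = 7.2
var = sum(x * x * f for x, f in freq.items()) / n - mean ** 2  # 2962/50 - 7.2^2 = 7.4
sd = math.sqrt(var)

print(mean, round(var, 2), round(sd, 2))  # 7.2 7.4 2.72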

b) When the number of tosses of the two dice is increased to 100, the values of the mean,
variance and standard deviation also change.

x       f          fx          fx^2
2       6          12          24
3       9          27          81
4       11         44          176
5       12         60          300
6       13         78          468
7       10         70          490
8       7          56          448
9       12         108         972
10      7          70          700
11      6          66          726
12      7          84          1008
Total   Σf = 100   Σfx = 675   Σfx^2 = 5393

Mean = 675/100 = 6.75
Variance = 5393/100 - 6.75^2 = 8.3675
Standard deviation = √8.3675 = 2.8927
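An experiment like the one tabulated above can also be simulated. The sketch below (with an arbitrary seed for reproducibility) tosses two dice n times and computes the same statistics; individual runs will differ from the hand-recorded results, but the values hover around the theoretical ones.

```python
import random

# Simulate tossing two fair dice n times and computing the summary statistics,
# as done by hand in Part 4. Results vary from run to run unless seeded.
def toss_stats(n, seed=None):
    rng = random.Random(seed)
    sums = [rng.randint(1, 6) + rng.randint(1, 6) for _ in range(n)]
    mean = sum(sums) / n
    var = sum(x * x for x in sums) / n - mean ** 2
    return mean, var

for n in (50, 100, 10_000):
    mean, var = toss_stats(n, seed=2016)
    print(n, round(mean, 4), round(var, 4))  # drifts toward 7 and 5.8333 as n grows
```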

Write your own comments about whether the prediction is proven.
c)
Based on Table 1, the actual mean, the variance and the standard deviation of the sum of
all dots on the turned-up faces are determined by using the formulae given.

x     P(x)    x·P(x)   x^2·P(x)
2     1/36    1/18     1/9
3     2/36    1/6      1/2
4     3/36    1/3      4/3
5     4/36    5/9      25/9
6     5/36    5/6      5
7     6/36    7/6      49/6
8     5/36    10/9     80/9
9     4/36    1        9
10    3/36    5/6      25/3
11    2/36    11/18    121/18
12    1/36    1/3      4
PART 5

a)
When two dice are tossed simultaneously, the actual mean and variance of the sum of all
dots on the turned-up faces can be determined by using the formulae below:

Mean = Σ x·P(x) = 252/36 = 7
Σ x^2·P(x) = 1974/36 = 329/6
Variance = Σ x^2·P(x) - (Mean)^2 = 329/6 - 49 = 35/6 = 5.8333
Standard deviation = √5.8333 = 2.4152
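These theoretical values can be verified mechanically; the sketch below recomputes the mean and variance of the sum of two dice with exact rational arithmetic.

```python
from fractions import Fraction
from itertools import product

# Exact mean and variance of the sum of two dice; each of the 36 outcomes has P = 1/36.
outcomes = [a + b for a, b in product(range(1, 7), repeat=2)]
p = Fraction(1, 36)

mean = sum(x * p for x in outcomes)                # Σ x·P(x) = 252/36
var = sum(x * x * p for x in outcomes) - mean ** 2  # 329/6 - 49 = 35/6

print(mean)             # 7
print(var, float(var))  # 35/6 5.8333...
```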

b)
The table below shows the comparison of the mean, variance and standard deviation of
Part 4 (experimental) and Part 5 (theoretical).

                     PART 4       PART 4        PART 5
                     n = 50       n = 100       (theoretical)
Mean                 7.2          6.75          7.00
Variance             7.4          8.3675        5.8333
Standard deviation   2.72         2.8927        2.4152

We can see that the mean, variance and standard deviation that we obtained through
experiment in Part 4 are different from, but close to, the theoretical values in Part 5.

For the mean, when the number of trials increased from n = 50 to n = 100, the value changed
from 7.2 to 6.75, staying close to the theoretical value of 7. By the Law of Large Numbers,
the empirical mean should approach 7 as the number of trials grows, although at these small
sample sizes it can still fluctuate on either side of it.

Nevertheless, the empirical variance and empirical standard deviation that we obtained in
Part 4 moved further from the theoretical values in Part 5. This does not truly contradict
the Law of Large Numbers; it is probably due to:

a. The sample (n = 100) not being large enough for the mean, variance and standard
deviation to settle near their theoretical values.
b. The Law of Large Numbers describing long-run behaviour only; individual experiments can
still deviate from the theoretical values, though the probability of large deviations
shrinks as n grows.

In conclusion, the empirical mean, variance and standard deviation can differ from the
theoretical values. As the number of trials (the sample size) gets bigger, the empirical
values should get closer to the theoretical values. However, noticeable deviations are
still possible, especially when the number of trials is not large enough.

c) The range of the mean

In our experiments the mean ranged from 6.75 (n = 100) to 7.2 (n = 50).

Conjecture: As the number of tosses, n, increases, the mean will get closer to 7, the
theoretical mean.
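The conjecture can be probed with a quick simulation; this sketch (seed chosen arbitrarily) tracks the empirical mean of the sum of two dice as n grows.

```python
import random

# As the number of tosses n grows, the empirical mean of the sum of two dice
# should settle near the theoretical mean of 7.
rng = random.Random(0)
means = {}

for n in (50, 500, 5_000, 50_000):
    sums = [rng.randint(1, 6) + rng.randint(1, 6) for _ in range(n)]
    means[n] = sum(sums) / n
    print(n, round(means[n], 3))
```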

The image below supports this conjecture: we can see that after about 500 tosses the
empirical (running) mean becomes very close to the theoretical mean, which is 3.5. (Note
that this is an experiment of tossing 1 die, not 2 dice as in our experiment.)

FURTHER EXPLORATION

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of
performing the same experiment a large number of times. According to the law, the average of
the results obtained from a large number of trials should be close to the expected value, and will
tend to become closer as more trials are performed.
For example, a single roll of a six-sided die produces one of the numbers 1, 2, 3, 4, 5, 6, each
with equal probability. Therefore, the expected value of a single die roll is

(1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5

According to the law of large numbers, if a large number of dice are rolled, the average of their
values (sometimes called the sample mean) is likely to be close to 3.5, with the accuracy
increasing as more dice are rolled.

Similarly, when a fair coin is flipped once, the expected value of the number of heads is equal to
one half. Therefore, according to the law of large numbers, the proportion of heads in a large
number of coin flips should be roughly one half. In particular, the proportion of heads after n
flips will almost surely converge to one half as n approaches infinity.
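The coin-flip statement can be illustrated with a short simulation; the sketch below (fixed seed, chosen arbitrarily) shows the proportion of heads tightening around 1/2 as n grows.

```python
import random

# Law of large numbers for a fair coin: the proportion of heads in n flips
# should approach 1/2 as n grows.
rng = random.Random(42)
proportions = {}

for n in (10, 100, 10_000, 1_000_000):
    heads = sum(rng.random() < 0.5 for _ in range(n))
    proportions[n] = heads / n
    print(n, proportions[n])
```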

Though the proportion of heads (and tails) approaches half, almost surely the absolute (nominal)
difference in the number of heads and tails will become large as the number of flips becomes
large. That is, the probability that the absolute difference is a small number approaches zero as
number of flips becomes large. Also, almost surely the ratio of the absolute difference to number
of flips will approach zero. Intuitively, expected absolute difference grows, but at a slower rate
than the number of flips, as the number of flips grows.

The LLN is important because it "guarantees" stable long-term results for random events. For
example, while a casino may lose money in a single spin of the roulette wheel, its earnings will
tend towards a predictable percentage over a large number of spins. Any winning streak by a
player will eventually be overcome by the parameters of the game. It is important to remember
that the LLN only applies (as the name indicates) when a large number of observations are
considered. There is no principle that a small number of observations will converge to the
expected value or that a streak of one value will immediately be "balanced" by the others. See
the Gambler's fallacy.
