MATH1231 - Probability


Set
A collection of objects such that, for any given object, it is possible to determine whether or not it belongs to the set.
Subset
If every element of the set A is also in the set B, then A is a subset of B.
Universal Set
The set of all elements (in any given context).
Empty Set
The set containing nothing. (Subset of any set).
Complement
The set containing everything not in the set A, written Ac.
Union
The set containing everything in A and everything in B.
Intersection
The set containing only the elements that are in both A and B.
Set Difference
The set A - B is defined as the intersection of A and the complement of B, i.e. A - B = A Intersection Bc.
Mutually Exclusive / Disjoint
Two sets A and B are mutually exclusive if they have no elements in common, i.e. if the intersection of A and B is the Empty Set.
Sample Space
The set of all possible outcomes of a given experiment.
Sample Point / Outcome
Any one possible outcome in a sample space.
Event
An event E is a subset of a sample space.
Statistically Independent
Two events A and B are statistically independent if P(A Intersection B) = P(A)P(B).
Random Variable
A real valued function defined on a sample space.
Probability Distribution (of a random variable X)
A description of all the probabilities of all events associated with X.
Probability Function

A function P defined on the events of a sample space, satisfying four conditions:
  1. 0 <= P(A) <= 1
  2. P(∅) = 0
  3. P(S) = 1
  4. If A and B are mutually exclusive events then P(A Union B) = P(A) + P(B).
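
The four conditions can be checked directly on a small example. A minimal Python sketch, using a hypothetical fair six-sided die with the uniform measure P(A) = |A| / 6 (the sample space S, events A and B, and the function P are all illustrative, not from the notes):

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die under the uniform measure.
S = frozenset(range(1, 7))

def P(A):
    """Probability of event A (a subset of S): |A| / |S|."""
    return Fraction(len(A), len(S))

A = {1, 2}   # roll a 1 or a 2
B = {5, 6}   # roll a 5 or a 6 (disjoint from A)

assert 0 <= P(A) <= 1            # condition 1
assert P(set()) == 0             # condition 2: P of the empty set is 0
assert P(S) == 1                 # condition 3: P of the sample space is 1
assert P(A | B) == P(A) + P(B)   # condition 4: additivity for disjoint events
```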

Addition Rule

Three rules that apply to all events:

  1. P(A Union B) = P(A) + P(B) - P(A Intersection B)
  2. P(Ac) = 1 - P(A)
  3. If A is a subset of B then P(A) <= P(B)
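
The three rules can be verified on the same kind of finite example. A short Python sketch, again assuming a hypothetical fair die (the specific events A and B are made up for illustration):

```python
from fractions import Fraction

# Hypothetical example: fair die, A = "even", B = "at most 3".
S = set(range(1, 7))
A = {2, 4, 6}
B = {1, 2, 3}

def P(E):
    """Uniform probability of event E."""
    return Fraction(len(E), len(S))

# Rule 1: inclusion-exclusion.
assert P(A | B) == P(A) + P(B) - P(A & B)   # 5/6 = 1/2 + 1/2 - 1/6
# Rule 2: complement.
assert P(S - A) == 1 - P(A)
# Rule 3: monotonicity ({2} is a subset of A).
assert P({2}) <= P(A)
```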

The Three Counting Rules

  1. If there are k experiments, and the i-th experiment has n_i possible outcomes, then the total number of possible outcomes for the k experiments performed together is the product n_1 × n_2 × ... × n_k.
  2. The number of possible permutations of r objects selected from n distinct objects is $^nP_r = \frac{n!}{(n-r)!}$
  3. The number of ways of choosing r objects from n distinct objects is $^nC_r = \frac{n!}{r!(n-r)!}$
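
All three counting rules correspond directly to standard-library functions in Python's `math` module, which gives a quick way to check small cases (the example numbers here are arbitrary):

```python
import math

# Rule 1 (multiplication): 3 experiments with 2, 3 and 4 outcomes.
outcomes = [2, 3, 4]
assert math.prod(outcomes) == 24          # 2 * 3 * 4

# Rule 2 (permutations): order matters.
assert math.perm(5, 2) == 20              # 5! / (5-2)! = 20

# Rule 3 (combinations): order does not matter.
assert math.comb(5, 2) == 10              # 5! / (2! * 3!) = 10
```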

Conditional Probability

The probability of A given B is the probability of (A and B), divided by the probability of B.

\begin{align} P(A|B) = \frac{P(A \cap B)}{P(B)} \end{align}
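
A small worked example of the formula in Python, assuming a hypothetical fair die (the events and the helper `P_given` are illustrative):

```python
from fractions import Fraction

# Hypothetical example: fair die, A = "greater than 3", B = "even".
S = set(range(1, 7))
A = {4, 5, 6}
B = {2, 4, 6}

def P(E):
    """Uniform probability of event E."""
    return Fraction(len(E), len(S))

def P_given(A, B):
    """Conditional probability P(A | B) = P(A intersect B) / P(B)."""
    return P(A & B) / P(B)

# A intersect B = {4, 6}, so P(A | B) = (2/6) / (3/6) = 2/3.
assert P_given(A, B) == Fraction(2, 3)
```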

Total Probability Rule

If B is an event in S, and the events A_1, A_2, ..., A_n form a partition of S, then:

\begin{align} P(B) = \displaystyle\sum\limits_{i=1}^n \text{ } P(B|A_i)P(A_i) \end{align}
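
A numerical check of the rule, using a hypothetical two-urn experiment (the urn contents and probabilities are made up for illustration): A_1 = "pick urn 1" and A_2 = "pick urn 2" partition the experiment, and B = "draw a red ball".

```python
from fractions import Fraction

# Hypothetical setup: each urn is picked with probability 1/2;
# urn 1 holds 3 red and 1 blue ball, urn 2 holds 1 red and 3 blue.
P_A = [Fraction(1, 2), Fraction(1, 2)]          # P(A_i)
P_B_given_A = [Fraction(3, 4), Fraction(1, 4)]  # P(B | A_i)

# Total probability rule: P(B) = sum over i of P(B | A_i) P(A_i).
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))
assert P_B == Fraction(1, 2)                    # 3/8 + 1/8
```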

Bayes' Rule

If B is an event in S, and the events A_1, A_2, ..., A_n form a partition of S, then:

\begin{align} P(A_j|B) = \frac{P(B|A_j)P(A_j)}{\displaystyle\sum\limits_{i=1}^n \text{ } P(B|A_i)P(A_i)} \end{align}
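
Bayes' Rule inverts a conditional probability. A sketch in Python, using a hypothetical two-urn experiment (each urn picked with probability 1/2; urn 1 yields red with probability 3/4, urn 2 with probability 1/4) to ask: given that a red ball was drawn, which urn did it come from?

```python
from fractions import Fraction

# Hypothetical example: partition A_1 = "urn 1", A_2 = "urn 2"; B = "drew red".
P_A = [Fraction(1, 2), Fraction(1, 2)]          # P(A_i)
P_B_given_A = [Fraction(3, 4), Fraction(1, 4)]  # P(B | A_i)

# Denominator is the total probability rule; numerator is P(B | A_1) P(A_1).
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))
P_A1_given_B = (P_B_given_A[0] * P_A[0]) / P_B

# Seeing red makes urn 1 three times as likely as urn 2.
assert P_A1_given_B == Fraction(3, 4)
```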

Probability Distribution

The probability distribution for a discrete random variable X taking values $\{x_k : k \in \mathbb{Z}\}$ is a set $\{p_k : k \in \mathbb{Z}\}$ such that:

\begin{align} P(X = x_k) = p_k \geq 0 \text{ and } \displaystyle\sum\limits_{k = -\infty}^{\infty} p_k = 1 \end{align}

That is, each possible value x_k of X is assigned its probability p_k, and (by definition) the probabilities sum to 1.
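
A discrete distribution can be represented as a simple mapping from values to probabilities and the two conditions checked directly. A sketch, using a hypothetical example (X = number of heads in two fair coin tosses):

```python
from fractions import Fraction

# Hypothetical distribution: X = number of heads in two fair coin tosses.
p = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

assert all(pk >= 0 for pk in p.values())   # every p_k is non-negative
assert sum(p.values()) == 1                # the probabilities sum to 1
```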

Expected Value

The expected value of X is defined as:

\begin{align} E(X) = \displaystyle\sum\limits_{k \in I} x_k p_k \end{align}


Variance

The variance of X is defined as:

\begin{align} Var(X) = E((X - E(X))^2) = \displaystyle\sum\limits_{k \in I} (x_k - E(X))^2 p_k = E(X^2) - E(X)^2 \end{align}

Standard Deviation

The standard deviation of X is defined as:

\begin{align} SD(X) = \sqrt{Var(X)} \end{align}
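
The three summaries (expected value, variance, standard deviation) can be computed from any discrete distribution. A minimal Python sketch, using a hypothetical fair six-sided die, which also checks that the two expressions for the variance agree:

```python
import math
from fractions import Fraction

# Hypothetical example: X = value shown by a fair six-sided die.
p = {k: Fraction(1, 6) for k in range(1, 7)}

E = sum(x * pk for x, pk in p.items())        # E(X) = sum of x_k p_k
E2 = sum(x**2 * pk for x, pk in p.items())    # E(X^2)
Var = E2 - E**2                               # Var(X) = E(X^2) - E(X)^2
SD = math.sqrt(Var)                           # SD(X) = sqrt(Var(X))

assert E == Fraction(7, 2)
assert Var == Fraction(35, 12)
# The definition-form variance gives the same answer.
assert Var == sum((x - E)**2 * pk for x, pk in p.items())
```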

Binomial Distribution

If we have n independent trials, each with probability of success p (so the probability of failure is q = 1 - p), then:

\begin{align} P(X = k) = p_k = B(n, p, k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \dots, n \end{align}

If X is a random variable with binomial distribution B(n, p), then X:
  a) has mean np
  b) has variance npq
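
The formula and the mean/variance facts can be checked numerically. A sketch in Python (the helper `binom_pmf` and the choice n = 4, p = 1/2 are illustrative):

```python
import math
from fractions import Fraction

def binom_pmf(n, p, k):
    """P(X = k) for X ~ B(n, p): C(n, k) p^k (1-p)^(n-k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical example: n = 4 trials with success probability p = 1/2.
n, p = 4, Fraction(1, 2)
pmf = [binom_pmf(n, p, k) for k in range(n + 1)]

assert sum(pmf) == 1                                       # a valid distribution
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2
assert mean == n * p                                       # np = 2
assert var == n * p * (1 - p)                              # npq = 1
```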