(Ignoring the lecture notes because in this case they weren't helpful)

Algorithm complexity matters because it lets us balance time efficiency, memory efficiency and simplicity.

Big O Notation, Theta Notation and Omega Notation are all asymptotic bounds, meaning they deal with very large inputs (as behaviour may differ for small inputs, we concern ourselves solely with behaviour past some seemingly magical threshold input size, n0).

Wikipedia Table for the visual amongst us

Where n0 is the point past which we judge the input as being big enough, T(n) is the function's actual running time, and c·g(n) is our upper bound (i.e. T(n) is O(g(n)))

## Big O and Big Omega

Big O and Big Omega are complexity classes: O bounds a function from above, Ω from below.

- O contains all classes less than it, e.g. O(n^{2}) contains O(n), O(log n) and O(1)
    - i.e. a constant-time function will never be worse than n^{2}
- Ω contains all classes greater than it, e.g. Ω(n^{2}) contains Ω(n^{3}) and Ω(n!)
    - i.e. a factorial function will never be better than n^{2}
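As a rough sanity check (not a proof), we can test these containments empirically past some n0. The helper name here is illustrative, not a standard function:

```python
# Empirically check f(n) <= c * g(n) for all n in [n0, limit).
# This only samples a finite range, so it suggests (not proves) a bound.
def is_bounded_above(f, g, c=1, n0=1, limit=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, limit))

constant = lambda n: 1          # a constant cost function
quadratic = lambda n: n * n     # an n^2 cost function

print(is_bounded_above(constant, quadratic))   # True: O(1) sits inside O(n^2)
print(is_bounded_above(quadratic, constant))   # False: n^2 is not O(1)
```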

## Tightness

That's where tightness comes in - saying everything is O(n!) and Ω(1) might be true, but it is useless. We want the most exact upper and lower bounds we can find.
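To illustrate, a linear-time cost function admits many technically-true upper bounds, but only the tight one is informative. A sketch, with hand-picked illustrative constants:

```python
import math

# Check f(n) <= c * g(n) for n in [n0, limit): a finite-range sanity check.
def bounded_by(f, g, c=1, n0=1, limit=1_000):
    return all(f(n) <= c * g(n) for n in range(n0, limit))

linear = lambda n: 2 * n   # a cost function that is really Theta(n)

print(bounded_by(linear, math.factorial, n0=3, limit=100))  # True, but a useless bound
print(bounded_by(linear, lambda n: n, c=2))                 # True, and tight: O(n)
print(bounded_by(linear, lambda n: 1, c=100))               # False: no constant bound works
```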

## Big Theta

Theta is slightly different in that it's a more exact bounding. Θ(g(n)) means that the running time will always be at least k·g(n) and at most l·g(n), where k and l are positive constants (for n ≥ n0). E.g. a Θ(n^{2}) function with k = 0.1 and l = 1000 is never better than 0.1n^{2} and never worse than 1000n^{2}.
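For example, T(n) = 3n^{2} + 5n is Θ(n^{2}): past n0 = 5 it sits between 3n^{2} and 4n^{2}. A finite-range check (constants chosen by hand for illustration):

```python
# Check k*g(n) <= T(n) <= l*g(n) for n in [n0, limit):
# a sampled sanity check of a Theta bound, not a proof.
def within_theta(T, g, k, l, n0, limit=10_000):
    return all(k * g(n) <= T(n) <= l * g(n) for n in range(n0, limit))

T = lambda n: 3 * n * n + 5 * n
g = lambda n: n * n

# 3n^2 <= 3n^2 + 5n <= 4n^2 holds once 5n <= n^2, i.e. n >= 5
print(within_theta(T, g, k=3, l=4, n0=5))   # True
print(within_theta(T, g, k=3, l=4, n0=1))   # False: the upper bound fails for small n
```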

## Most Common Complexities

In approximate order of fastest to slowest

| Complexity | Name |
| --- | --- |
| O(1) | Constant bound |
| O(log n) | Log n bound |
| O(n) | Linear bound |
| O(n log n) | n log n bound |
| O(n^{x}) | Polynomial, e.g. n^{2} or n^{3} bound |
| O(c^{n}) | Exponential |
| O(n!) | Factorial |
| O(n^{n}) | n to the power of n |
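Evaluating each class at a single input size makes the ordering concrete (a quick illustrative script, not part of any standard library):

```python
import math

# Evaluate each common complexity at n = 10 to see the growth ordering.
funcs = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
    ("O(2^n)",     lambda n: 2 ** n),
    ("O(n!)",      lambda n: math.factorial(n)),
    ("O(n^n)",     lambda n: n ** n),
]

for name, f in funcs:
    print(f"{name:12} n=10 -> {f(10):,.0f}")
# The values come out in strictly increasing order, matching the table above.
```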

## Difference between O(1) and constant time

O(1) says that a function's running time is bounded above by a constant, but the individual input times may fluctuate beneath that bound.

Constant time means that a function runs at the same speed for all inputs.
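A hypothetical pair of functions makes the distinction concrete. Both are O(1), since the work never exceeds a fixed constant, but only the first is truly constant time:

```python
# Truly constant time: exactly the same single operation for every input.
def truly_constant(n):
    return n % 2

# Still O(1): the work is at most 10 steps regardless of n, but the
# actual time fluctuates beneath that constant bound (1 or 10 steps).
def bounded_but_fluctuating(n):
    steps = 1 if n % 2 == 0 else 10
    total = 0
    for _ in range(steps):
        total += 1
    return total

print(bounded_but_fluctuating(4))   # 1 step's worth of work
print(bounded_but_fluctuating(7))   # 10 steps' worth, yet still O(1)
```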

## Constants and Lesser Complexities

We always ignore constants: x·n^{2} is always less than n^{3} for n > x, so if we pick our n0 as being bigger than x, x·n^{2} stays significantly below n^{3} and the constant x makes no difference to the complexity class.

We also ignore lower powers: for instance in n^{3} + n^{2} + n we ignore the n^{2} and n, as for large n they are so much smaller than n^{3} that they have little impact.
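A quick numerical sketch of why the lower-order terms vanish: the ratio of the full expression to its leading term tends to 1 as n grows.

```python
# n^3 + n^2 + n behaves like n^3 for large n: the ratio tends to 1.
def full(n):
    return n ** 3 + n ** 2 + n

def leading(n):
    return n ** 3

for n in (10, 1_000, 100_000):
    print(n, full(n) / leading(n))
# The ratio shrinks toward 1 as n increases, so the n^2 and n terms
# contribute nothing asymptotically.
```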