COMP1927 - Complexity Analysis

(Ignoring the lecture notes because in this case they weren't helpful)

Analysing algorithm complexity is important in order to strike the right balance between time efficiency, memory efficiency and simplicity.

Big O Notation, Theta Notation and Omega Notation are all asymptotic bounds, meaning they deal with very large inputs (as behaviour may change for small inputs, we concern ourselves solely with behaviour past some threshold input size, conventionally called n0).

A Wikipedia diagram for the visual amongst us:

[Image bigo.png: graph of a running time T(n) bounded above by c·g(n) for all inputs n ≥ n0]

Here n0 is the point past which we judge the input as being big enough, T(n) is the function itself, i.e. its actual running time, and c·g(n) is our upper bound (i.e. T(n) is O(g(n))).
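For reference, the picture corresponds to the standard formal definition of Big O (a textbook statement, not from the lecture notes):

$$T(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ T(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0$$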

Big O and Big Omega

Big O and Big Omega are families of complexity classes: O gives an upper bound on growth, while Ω gives a lower bound.

  • O contains all classes less than it, like O(n^2) contains O(n), O(log n) and O(1) (restated in set notation below)
    • i.e. a constant-time function will never be worse than n^2
  • Ω contains all classes greater than it, like Ω(n^2) contains Ω(n^3) and Ω(n!)
    • i.e. a factorial-time function will never be better than n^2
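Restating those bullets as set containments (standard notation, not from the notes):

$$O(1) \subseteq O(\log n) \subseteq O(n) \subseteq O(n^2) \qquad\text{and}\qquad \Omega(n!) \subseteq \Omega(n^3) \subseteq \Omega(n^2)$$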

Tightness

That's where tightness comes in - saying every algorithm is O(n!) and Ω(1) might be true, but it's useless. We want the tightest (most exact) upper and lower bounds we can find.

Big Theta

Theta is slightly different in that it is an exact, two-sided bound. Θ(n) means that the time will always be at least k·n and at most l·n, where k and l are positive constants (e.g. a running time of n^2 is at least 0.1n^2 and at most 1000n^2, so it is Θ(n^2)).
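Formally (again the standard textbook definition):

$$T(n) \in \Theta(g(n)) \iff \exists\, k, l, n_0 > 0 \ \text{such that}\ k \cdot g(n) \le T(n) \le l \cdot g(n) \ \text{for all}\ n \ge n_0$$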

Most Common Complexities

In approximate order of fastest to slowest

  • O(1): constant bound
  • O(log n): logarithmic bound
  • O(n): linear bound
  • O(n log n): "n log n" bound
  • O(n^x): polynomial bound, e.g. n^2 or n^3
  • O(c^n): exponential bound
  • O(n!): factorial bound
  • O(n^n): n to the power of n
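As a concrete illustration of two of these classes, here is a minimal C sketch (my example, not from the lecture notes) contrasting O(n) linear search with O(log n) binary search on a sorted array:

```c
#include <stdio.h>

// O(n): in the worst case, inspects every element.
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key) return i;
    return -1;
}

// O(log n): halves the search range each iteration (requires sorted input).
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1;
        else              hi = mid - 1;
    }
    return -1;
}

int main(void) {
    int a[] = {1, 3, 5, 7, 9, 11};
    printf("linear: %d, binary: %d\n",
           linear_search(a, 6, 7), binary_search(a, 6, 7));
    return 0;
}
```

Doubling n roughly doubles the work for linear search, but adds only one extra step for binary search.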

Difference between O(1) and constant time

O(1) says that a function's running time is bounded above by some constant, but the times for individual inputs may fluctuate beneath that bound.
Constant time means that a function takes the same time for every input.
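A minimal C sketch of the distinction (hypothetical helper functions, assuming 64-bit integers):

```c
#include <stdint.h>
#include <stdio.h>

// Constant time: exactly one array access for every input.
int first_element(const int arr[]) {
    return arr[0];
}

// O(1) but not constant time: the loop runs between 1 and 8 times
// depending on x, yet is always bounded by the constant 8.
int leading_zero_bytes(uint64_t x) {
    int count = 0;
    for (int i = 7; i >= 0; i--) {
        if ((x >> (i * 8)) & 0xFF)   // stop at the first non-zero byte
            break;
        count++;
    }
    return count;
}

int main(void) {
    int arr[] = {42, 7, 9};
    printf("%d\n", first_element(arr));              // one step, always
    printf("%d\n", leading_zero_bytes(1));           // 8 iterations, returns 7
    printf("%d\n", leading_zero_bytes(1ULL << 56));  // 1 iteration, returns 0
    return 0;
}
```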

Constants and Lesser Complexities

We always ignore constant factors, as x·n^2 is always less than n^3 for n > x. Hence if we pick our n0 as being bigger than x, x·n^2 stays strictly below n^3 from that point on, so the constant x doesn't change the class.

We also ignore lower powers: for instance, in n^3 + n^2 + n we ignore the n^2 and n terms, as for large n they are so much smaller that they have little impact (see the sketch below).
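To see why, here is a quick C sketch (an illustration of the point, not course code) printing the fraction of n^3 + n^2 + n contributed by the n^3 term as n grows; it rapidly approaches 1:

```c
#include <stdio.h>

// Shows the n^3 term dominating n^3 + n^2 + n as n grows.
int main(void) {
    for (double n = 10; n <= 1e6; n *= 10) {
        double cube  = n * n * n;
        double total = cube + n * n + n;
        printf("n = %8.0f   n^3 / (n^3 + n^2 + n) = %.6f\n", n, cube / total);
    }
    return 0;
}
```

Already at n = 1000 the lower-order terms contribute about 0.1% of the total.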