Understanding Other Asymptotic Notations

Understanding Big Omega (Ω), Big Theta (Θ), Small Oh (o) and Small Omega (ω) Notations


How to describe Big Omega (Ω)?

If the running time of an algorithm is Ω(g(n)), it means that the running time (as n gets larger) is at least proportional to g(n). Hence Ω helps in estimating a lower bound on the number of basic operations to be performed.

More specifically, f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is asymptotically greater than or equal to the growth rate of g(x).


Mathematically, a function f(x) is Big Omega of another function g(x), i.e. f(x) = Ω(g(x)), if and only if there exist two constants C1 and C2 such that


a) C1 and C2 are positive
b) 0 <= C1*g(n) <= f(n) for all n >= C2
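
For instance, f(n) = 3n^2 + 2n is Ω(n^2) with the witnesses C1 = 3 and C2 = 1, since 3n^2 <= 3n^2 + 2n for every n >= 1. The following minimal Python sketch spot-checks that inequality over a sample range; the functions and constants are illustrative assumptions, not taken from any particular algorithm:

def f(n):
    return 3 * n**2 + 2 * n  # example running-time function (assumed)

def g(n):
    return n**2              # candidate lower-bound function

C1, C2 = 3, 1                # witness constants from the definition above

# The definition demands 0 <= C1*g(n) <= f(n) for every n >= C2;
# a finite loop can only spot-check the claim, not prove it.
assert all(0 <= C1 * g(n) <= f(n) for n in range(C2, 10000))
print("Omega inequality holds on the sampled range")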

How to describe Big Theta (Θ)?

If the running time of an algorithm is Θ(g(n)), it means that the running time (as n gets larger) grows at the same rate as g(n). Hence Θ helps in estimating a tight bound on the number of basic operations to be performed.

Hence f(x) = Θ(g(x)) (big-theta) means that the growth rate of f(x) is asymptotically equal to the growth rate of g(x).

Mathematically, a function f(x) is Big Theta of another function g(x), i.e. f(x) = Θ(g(x)), if and only if there exist three constants C1, C2 and C3 such that

a) C1, C2 and C3 are positive
b) 0 <= C1*g(n) <= f(n) <= C2*g(n) for all n >= C3
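
Continuing the same illustrative example, f(n) = 3n^2 + 2n is Θ(n^2) with the witnesses C1 = 3, C2 = 5 and C3 = 1, since 3n^2 <= 3n^2 + 2n <= 5n^2 for every n >= 1. A minimal Python spot-check of both inequalities (again an assumed example, not a proof):

def f(n):
    return 3 * n**2 + 2 * n  # example running-time function (assumed)

def g(n):
    return n**2              # candidate tight-bound function

C1, C2, C3 = 3, 5, 1         # witness constants from the definition above

# Theta sandwiches f(n) between C1*g(n) and C2*g(n) for every n >= C3.
assert all(0 <= C1 * g(n) <= f(n) <= C2 * g(n) for n in range(C3, 10000))
print("Theta inequalities hold on the sampled range")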

What are Small Oh and Small Omega?

f(x) = o(g(x)) (small-oh) means that the growth rate of f(x) is asymptotically strictly less than the growth rate of g(x).

Mathematically, a function f(x) is Small Oh of another function g(x), i.e. f(x) = o(g(x)), if and only if for every positive constant C1 there exists a constant C2 such that

a) C2 is positive
b) 0 <= f(n) < C1*g(n) for all n >= C2

Note that, unlike Big Omega and Big Theta, the condition must hold for every choice of C1, not just for one.

So small-oh gives an upper bound on the complexity of f(x) that is not tight: f(x) grows strictly slower than g(x).
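
For instance, f(n) = n is o(n^2): for any positive C1, the inequality n < C1*n^2 holds as soon as n > 1/C1. The sketch below (an assumed illustration) tests several shrinking values of C1 and, for each, computes a threshold C2 beyond which the inequality holds:

def f(n):
    return n       # example function (assumed)

def g(n):
    return n**2    # grows strictly faster than f

# Small-oh requires the bound for EVERY positive C1, so we try several.
for C1 in (1.0, 0.1, 0.01):
    C2 = int(1 / C1) + 1  # f(n) < C1*g(n) holds exactly when n > 1/C1
    assert all(0 <= f(n) < C1 * g(n) for n in range(C2, 10000))
    print(f"C1={C1}: 0 <= f(n) < C1*g(n) for all sampled n >= {C2}")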

On the other hand, f(x) = ω(g(x)) (small-omega) means that the growth rate of f(x) is asymptotically strictly greater than the growth rate of g(x).

Mathematically, a function f(x) is Small Omega of another function g(x), i.e. f(x) = ω(g(x)), if and only if for every positive constant C1 there exists a constant C2 such that

a) C2 is positive
b) 0 <= C1*g(n) < f(n) for all n >= C2

Again, the condition must hold for every choice of C1, not just for one.

So small-omega gives a lower bound on the complexity of f(x) that is not tight: f(x) grows strictly faster than g(x).
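
Symmetrically, f(n) = n^2 is ω(n): for any positive C1, C1*n < n^2 holds as soon as n > C1. A minimal spot-check in the same assumed style:

def f(n):
    return n**2    # grows strictly faster than g

def g(n):
    return n       # example function (assumed)

# Small-omega requires the bound for EVERY positive C1, so we try several.
for C1 in (1, 10, 100):
    C2 = C1 + 1    # C1*g(n) < f(n), i.e. C1*n < n^2, holds exactly when n > C1
    assert all(0 <= C1 * g(n) < f(n) for n in range(C2, 10000))
    print(f"C1={C1}: 0 <= C1*g(n) < f(n) for all sampled n >= {C2}")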