Asymptotic notation

This theory bit recalls the definition and basic properties of the Big-Oh, Big-Omega and Big-Theta notations. Deciding whether a function f is O(g) (or \Theta(g), or \Omega(g)) typically reduces to computing a limit. You might need to refresh how to compute limits (e.g. de L’Hôpital’s rule); see the theory bit on limits.

In this page we assume f,g to be functions from \mathbb R to \mathbb R.

On the Big-Oh notation

  • f grows asymptotically slower than (or as fast as) g
  • f=O(g)
  • f is Big-Oh of g

have the same meaning and are interchangeable. Formally, f is Big-Oh of g if, for all large enough x, f(x) is at most a constant multiple of g(x). That is, f=O(g) if \exists x_0\in \mathbb R_+\ \exists c\in \mathbb R_+\ \forall x\geq x_0\ f(x)\leq c\cdot g(x)\ .

In particular, if \displaystyle\lim_{x\to \infty}\frac{f(x)}{g(x)} exists and is finite, then f=O(g).
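As a sanity check (not a proof), the definition can be tested numerically: pick candidate witnesses c and x_0 and verify f(x)\leq c\cdot g(x) over a range of x. The functions and witnesses below are illustrative choices, not taken from this page:

```python
# Illustrative sketch: f(x) = 2x^2 + 5x is O(x^2).
# The witnesses c = 3 and x0 = 10 are guesses; a numeric scan is only
# evidence, not a proof (a proof needs the inequality for ALL x >= x0).
def f(x):
    return 2 * x**2 + 5 * x

def g(x):
    return x**2

c, x0 = 3, 10
assert all(f(x) <= c * g(x) for x in range(x0, 10_000))

# The ratio f(x)/g(x) tends to 2 < +infinity, consistent with f = O(g).
```

Indeed 2x^2+5x\leq 3x^2 holds exactly when x\geq 5, so c=3 together with any x_0\geq 5 is a valid pair of witnesses.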

On the Big-Omega notation

  • f grows asymptotically faster than (or as fast as) g
  • f=\Omega(g)
  • f is Big-Omega of g

have the same meaning and are interchangeable. Formally, f is Big-Omega of g if, for all large enough x, f(x) is at least a constant multiple of g(x). That is, f=\Omega(g) if \exists x_0\in \mathbb R_+\ \exists c\in \mathbb R_+\ \forall x\geq x_0\ f(x)\geq c\cdot g(x)\ .

In particular, if \displaystyle\lim_{x\to \infty}\frac{f(x)}{g(x)}> 0 then f=\Omega(g).
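The same kind of numeric sanity check works for Big-Omega, here with an example where the limit of the ratio is +\infty (the functions and witnesses are again illustrative choices):

```python
import math

# Illustrative sketch: f(x) = x*log(x) is Omega(x),
# since f(x)/g(x) = log(x) -> +infinity > 0.
def f(x):
    return x * math.log(x)

def g(x):
    return x

# Candidate witnesses: c = 1, x0 = 3 (log(3) > 1, and log is increasing).
c, x0 = 1, 3
assert all(f(x) >= c * g(x) for x in range(x0, 10_000))
```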

Big-Oh and Big-Omega are dual notions: f=O(g) if and only if g=\Omega(f). If f grows asymptotically slower than g, then g grows asymptotically faster than f.

On the Big-Theta notation

  • f has the same asymptotic growth as g
  • f=\Theta(g)
  • g=\Theta(f)
  • f is Big-Theta of g
  • g is Big-Theta of f

all have the same meaning and are interchangeable. Formally, f is Big-Theta of g if it is both Big-Oh and Big-Omega of g. That is, f=\Theta(g) if \exists x_0\in \mathbb R_+\ \exists c_1,c_2\in \mathbb R_+\ \forall x\geq x_0\ c_1\cdot g(x)\leq f(x)\leq c_2\cdot g(x)\ .

In particular, if \displaystyle\lim_{x\to \infty}\frac{f(x)}{g(x)} = c with 0< c<+\infty, then f=\Theta(g).
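Continuing in the same spirit, a Big-Theta claim needs two witnesses c_1 and c_2; the example below is an illustrative sketch:

```python
# Illustrative sketch: f(x) = 3x^2 + 7x + 1 is Theta(x^2),
# since f(x)/g(x) -> 3, with 0 < 3 < +infinity.
def f(x):
    return 3 * x**2 + 7 * x + 1

def g(x):
    return x**2

# Candidate witnesses: c1 = 3, c2 = 4, x0 = 8 (7x + 1 <= x^2 for x >= 8).
c1, c2, x0 = 3, 4, 8
assert all(c1 * g(x) <= f(x) <= c2 * g(x) for x in range(x0, 10_000))
```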

Common errors/confusion

f=\Theta(g), f=\Omega(g) and f=O(g) are statements about the asymptotic growth rate of the function f in comparison with that of g. Although f or g might be connected to algorithms (e.g. some running times), the notation carries no implicit reference to an algorithm. In particular, the \Theta, O and \Omega notations are not connected at all to the worst/best/average cases of the running time of algorithms.

Sometimes the practical uses of the O and \Theta notations overlap: at times people write O(\cdot) when they actually mean \Theta(\cdot). Recall that O(\cdot) denotes only an upper bound on the asymptotic growth, while \Theta(\cdot) indicates the exact asymptotic growth rate of a function. In EDA we don’t want to do this; we try to be as precise as possible.

Example

The sentences

  • the Insertion Sort algorithm in the worst case runs in time O(2^n), and
  • the Insertion Sort algorithm in the worst case runs in time \Theta(n^2)

(where n is the size of the input) are both technically correct, but the first one is not very informative. It is so uninformative that in EDA exams it might even be counted as incorrect.
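To see where the \Theta(n^2) worst-case bound comes from, here is a standard insertion sort instrumented with a shift counter (the counter is our addition, for illustration): on a reverse-sorted input of size n it performs exactly n(n-1)/2 shifts, which is \Theta(n^2).

```python
def insertion_sort(a):
    """Sort the list a in place and return the number of element shifts."""
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's position is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

# Worst case: reverse-sorted input forces n*(n-1)/2 shifts, i.e. Theta(n^2).
n = 100
assert insertion_sort(list(range(n, 0, -1))) == n * (n - 1) // 2
```

Note that on already-sorted input the inner loop never fires, so the best case is \Theta(n): a different statement from the worst-case \Theta(n^2), which again shows that the notation itself says nothing about worst vs. best case.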

Example

The figure below is taken from the internet and visually summarizes some growth orders of functions. It is a mostly correct visual aid, but not entirely correct. Can you spot the error(s)?

The figure mixes n and x as the name of the variable. That is an easy fix, but there is another error: the figure would have been correct if each O(\cdot) were a \Theta(\cdot).