Asymptotics and Correctness
PhD Student
Spring 2018
An array of practice problems + homework help
But first I want to remind you of some definitions, and equivalent forms of those definitions.
Big-O
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(O(g(n))\) if there exist two positive constants \(c_0\) and \(n_0\) such that
\(f(n) \leq c_0 g(n)\) for every \(n > n_0 \)
This is an asymptotic upper bound -- \(f(n)\) grows at most as fast as \(g(n)\)
This is often written as \(f(n) = O(g(n))\) or \(f(n) \in O(g(n))\)
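For example, \(3n^2 + 7n = O(n^2)\): for every \(n \geq 1\) we have \(7n \leq 7n^2\), so
\(3n^2 + 7n \leq 10 n^2\),
and the definition is satisfied with \(c_0 = 10\) and \(n_0 = 1\).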
Big-O (Limit Form)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(O(g(n))\) if there exists a constant \(0 \leq c < \infty\) such that
\(\lim_{n \to \infty}\frac{f(n)}{g(n)} = c \)
This form is equivalent to the previous definition whenever the limit exists, and it is often easier to apply.
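Using the same example,
\(\lim_{n \to \infty}\frac{3n^2 + 7n}{n^2} = \lim_{n \to \infty}\left(3 + \frac{7}{n}\right) = 3\),
and since \(0 \leq 3 < \infty\) we again conclude \(3n^2 + 7n = O(n^2)\).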
Big-\(\Omega\)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(\Omega(g(n))\) if there exist two positive constants \(c_0\) and \(n_0\) such that
\(f(n) \geq c_0 g(n)\) for every \(n > n_0 \) .
This is an asymptotic lower bound -- \(f(n)\) grows no slower than \(g(n)\)
This is often written as \(f(n) = \Omega(g(n))\) or \(f(n) \in \Omega(g(n))\)
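For example, \(3n^2 + 7n = \Omega(n^2)\): for every \(n \geq 1\) we have
\(3n^2 + 7n \geq 3n^2\),
so the definition holds with \(c_0 = 3\) and \(n_0 = 1\).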
Big-\(\Omega\) (Limit Form)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(\Omega(g(n))\) if there exists a constant \(0 < c \leq \infty\) such that
\(\lim_{n \to \infty}\frac{f(n)}{g(n)} = c. \)
Again, this form is equivalent to the previous definition whenever the limit exists, and it is often easier to apply.
Notice the different conditions on \(c\) compared to the limit definition of \(O(g(n))\)
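For example, \(n^2 = \Omega(n \log n)\) even though \(n^2\) is not \(O(n \log n)\):
\(\lim_{n \to \infty}\frac{n^2}{n \log n} = \lim_{n \to \infty}\frac{n}{\log n} = \infty\),
and an infinite limit is allowed here but not in the Big-O limit form.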
Big-\(\Theta\)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(\Theta(g(n))\) if there exist three positive constants \(c_0\), \(c_1\), and \(n_0\) such that
\(c_0g(n) \leq f(n) \leq c_1 g(n)\) for every \(n > n_0 \) .
This is an asymptotic tight bound -- \(f(n)\) grows no slower and no faster than \(g(n)\)
This is often written as \(f(n) = \Theta(g(n))\) or \(f(n) \in \Theta(g(n))\)
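Combining the two earlier examples, \(3n^2 \leq 3n^2 + 7n \leq 10n^2\) for every \(n \geq 1\), so \(3n^2 + 7n = \Theta(n^2)\) with \(c_0 = 3\), \(c_1 = 10\), and \(n_0 = 1\).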
Big-\(\Theta\) (Limit Form)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(\Theta(g(n))\) if there exists a constant \(0 < c < \infty\) such that
\(\lim_{n \to \infty}\frac{f(n)}{g(n)} = c. \)
Again, this form is equivalent to the previous definition whenever the limit exists, and it is often easier to apply.
Again, notice the different conditions on \(c\) compared to the limit definitions of \(O(g(n))\) and \(\Omega(g(n))\).
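For example, \(\binom{n}{2} = \frac{n(n-1)}{2} = \Theta(n^2)\), because
\(\lim_{n \to \infty}\frac{n(n-1)/2}{n^2} = \frac{1}{2}\)
and \(0 < \frac{1}{2} < \infty\).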
Little-o
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(o(g(n))\) if, for every positive constant \(c\), there exists a constant \(n_0\) such that
\(f(n) < c \cdot g(n)\) for every \(n > n_0 \) .
The intuition for this is that \(g(n)\) grows strictly faster than \(f(n)\).
How is this different from \(O(g(n))\)?
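For example, \(n = o(n^2)\): given any \(c > 0\), we have \(n < c n^2\) whenever \(n > 1/c\), so we can take \(n_0 = 1/c\). Here \(n_0\) is allowed to depend on \(c\); the fact that the inequality must hold for every positive \(c\), rather than for a single constant \(c_0\), is what distinguishes little-o from Big-O.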
Little-o (Limit Form)
Given functions \(f(n)\) and \(g(n)\), we say that \(f(n)\) is \(o(g(n))\) if
\(\lim_{n \to \infty}\frac{f(n)}{g(n)} = 0. \)
This is often written as \(f(n) = o(g(n))\) or \(f(n) \in o(g(n))\)
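For the same example, \(\lim_{n \to \infty}\frac{n}{n^2} = \lim_{n \to \infty}\frac{1}{n} = 0\), so \(n = o(n^2)\).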
Summary
Notation | Relationship | Conditions
\(f(n) = O(g(n))\) | \(f(n) \leq c_0 g(n), \ \forall n > n_0\) | \(\exists n_0, \exists c_0 > 0\)
\(f(n) = o(g(n))\) | \(f(n) < c_0 g(n), \ \forall n > n_0\) | \(\forall c_0 > 0, \exists n_0\)
\(f(n) = \Omega(g(n))\) | \(f(n) \geq c_0 g(n), \ \forall n > n_0\) | \(\exists n_0, \exists c_0 > 0\)
\(f(n) = \Theta(g(n))\) | \(c_0 g(n) \leq f(n) \leq c_1 g(n), \ \forall n > n_0\) | \(\exists n_0, \exists c_0, \exists c_1 > 0\)
\(\exists\) means "there exists", \(\forall\) means "for all"
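A quick illustration of the quantifier difference between \(O\) and \(o\): \(n^2 = O(n^2)\) since the single choice \(c_0 = 1\) works, but \(n^2\) is not \(o(n^2)\), since the inequality \(n^2 < c_0 n^2\) already fails for \(c_0 = \frac{1}{2}\) and it must hold for every positive \(c_0\).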
Summary (Limit Definition)
Say we evaluate the limit
\(\lim_{n \to \infty}\frac{f(n)}{g(n)} = c. \)
If \(c = 0\) then \(f(n) = o(g(n))\) and \(f(n) = O(g(n))\)
If \(0 < c < \infty \) then \(f(n) = O(g(n))\), \(f(n) = \Omega(g(n))\) and \(f(n) = \Theta(g(n))\)
If \(c = \infty \) then \(f(n) = \Omega(g(n))\) and \(f(n) = \omega(g(n))\) (little-\(\omega\), the strict analogue of \(\Omega\), just as \(o\) is the strict analogue of \(O\))
If you're going to take the limit route, you need to justify that the limits you use actually exist.
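For instance, if \(f(n) = (2 + (-1)^n)\,n\) and \(g(n) = n\), the ratio \(f(n)/g(n)\) alternates between \(1\) and \(3\), so the limit does not exist; yet \(n \leq f(n) \leq 3n\) for every \(n\), so \(f(n) = \Theta(n)\) by the constant-based definition.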
Sum Law
If the limits \(\lim_{n \to \infty} f(n) = A\) and \(\lim_{n \to \infty} g(n) = B\) exist (and are finite) then
\(\lim_{n \to \infty} f(n)+g(n) = A+B\)
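For example,
\(\lim_{n \to \infty}\frac{3n^2+7n}{n^2} = \lim_{n \to \infty}\left(3 + \frac{7}{n}\right) = \lim_{n \to \infty} 3 + \lim_{n \to \infty}\frac{7}{n} = 3 + 0 = 3\),
since both limits exist and are finite.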
Product Law
If the limits \(\lim_{n \to \infty} f(n) = A\) and \(\lim_{n \to \infty} g(n) = B\) exist (and are finite) then
\(\lim_{n \to \infty} f(n)\cdot g(n) = A\cdot B\)
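For example,
\(\lim_{n \to \infty}\frac{(3n+1)(2n+5)}{n^2} = \left(\lim_{n \to \infty}\frac{3n+1}{n}\right)\cdot\left(\lim_{n \to \infty}\frac{2n+5}{n}\right) = 3 \cdot 2 = 6\),
so \((3n+1)(2n+5) = \Theta(n^2)\).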
L'Hopital's Rule
If \(\lim_{x \to \infty} f(x)/g(x) \) is an indeterminate form (i.e. \(0/0\) or \(\pm\infty/\pm\infty\)), \(f\) and \(g\) are differentiable, and \(\lim_{x \to \infty} f'(x)/g'(x)\) exists, then we have
\(\lim_{x \to \infty} \frac{f(x)}{g(x)} = \lim_{x \to \infty} \frac{f'(x)}{g'(x)} \)
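For example, \(\lim_{x \to \infty}\frac{\ln x}{x}\) is of the form \(\infty/\infty\), and applying the rule gives
\(\lim_{x \to \infty}\frac{1/x}{1} = 0\),
which shows \(\ln n = o(n)\).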
Recursion and Sorting