Growth of Functions

https://slides.com/georgelee/ics141-algorithms-growth/live

Recap

How do we compare algorithms?

Estimating the Growth

  • We determine the number of "operations"
  • Base our estimate on the size of the problem
  • Assume that the size goes to infinity
  • Remove constants, coefficients, and smaller terms
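As a quick illustration (a hypothetical snippet, not from the slides), counting the "operations" of a simple pair of loops shows where the estimate comes from: the exact count is n^2 + n, and dropping the smaller term gives O(n^2).

# Hypothetical example: count the "operations" of two simple loops.
def count_operations(n):
    ops = 0
    for i in range(n):       # outer loop runs n times
        for j in range(n):   # inner loop runs n times per outer pass
            ops += 1         # one "operation" per inner step
    for i in range(n):       # a second, separate loop: n more operations
        ops += 1
    return ops               # exactly n**2 + n

for n in (10, 100, 1000):
    print(n, count_operations(n))   # the n**2 term dominates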

Big O Notation

Let f(x) and g(x) be functions defined on a subset of the real numbers. We say f(x) = O(g(x)) if there are real numbers C and k such that |f(x)| ≤ C|g(x)| for all x > k.

 

C and k are "witnesses" to the relationship f(x) = O(g(x))

Examples

 

f(x) = 3x^2 + 1 ≤ 3x^2 + x^2 = 4x^2 when x > 1, so f(x) = O(x^2) with witnesses C = 4 and k = 1

 

f(x) = 3 = O(1)

 

f(x) = 3x^2 + 1 ≤ 3x^3 + x^3 = 4x^3 when x > 1, so f(x) = O(x^3) as well (a valid but looser bound)
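These witnesses can be checked numerically. Below is a minimal sketch (the witnesses C = 4, k = 1 and the range tested come from the first example above):

# Check the witnesses C = 4, k = 1 for f(x) = 3x^2 + 1 = O(x^2).
def f(x):
    return 3 * x**2 + 1

C, k = 4, 1
assert all(abs(f(x)) <= C * abs(x**2) for x in range(k + 1, 10000))
print("|f(x)| <= 4|x^2| holds for every integer x tested above k = 1")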

Smallest Possible Function

Whenever we talk about these notations, we expect to use the smallest possible g(x) such that f(x) is O(g(x)).

Proving f(x) is not O(g(x))

Typically, we use a proof by contradiction: assume that f(x) is O(g(x)), then show that no witnesses C and k can exist.

Example

Prove f(x) = 4x is not O(log x)

We use a proof by contradiction. Assume that f(x) is O(log x). Then there exist witnesses C and k such that 4x ≤ C log x for all x > k.

 

We then have 2^(4x) ≤ 2^(C log x) = (2^(log x))^C = x^C, taking the logarithm to be base 2. So 2^(4x) ≤ x^C.

 

No constant C can satisfy 2^(4x) ≤ x^C for all large x, since the exponential eventually outgrows every fixed power of x. This contradiction shows that 4x is not O(log x).
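The failure can also be seen numerically. A small sketch (the values of C are arbitrary illustrations): no matter how large C is, 4x overtakes C log x eventually.

import math

# For any fixed C, 4x eventually exceeds C * log2(x),
# so no witness C works for all x > k.
for C in (10, 100, 1000):
    x = 2
    while 4 * x <= C * math.log2(x):
        x *= 2
    print(f"C = {C}: 4x > C log x by x = {x}")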

"Worst Case Complexity"

Big-O gives a function that eventually dominates the original function (up to a constant). It is therefore an upper bound on the algorithm's running time, which is why it describes worst-case behavior.

 

What's the worst case and best case for the algorithms that we have covered so far?

Useful Estimates

n-th degree Polynomial

Recall that an n-th degree polynomial is of the form:

 

f(x) = a_n x^n + a_(n-1) x^(n-1) + a_(n-2) x^(n-2) + ... + a_1 x + a_0

 

A polynomial of this form is O(x^n).
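A quick numeric sketch of why the lower-order terms drop out (the coefficients below are made up): the ratio f(x) / x^n approaches the leading coefficient as x grows.

# f(x) = 5x^3 + 2x^2 + 7 is O(x^3); the ratio f(x)/x^3 tends to
# the leading coefficient 5, so the smaller terms stop mattering.
def f(x):
    return 5 * x**3 + 2 * x**2 + 7

for x in (10, 100, 1000, 10000):
    print(x, f(x) / x**3)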

Logarithms and Powers

 

The book explains these in a very confusing way

 

O((log n)^c) < O(n^d) for any constants c, d > 0: Logs are smaller than polynomials

 

O(n^d) < O(b^n) for any d > 0 and b > 1: Polynomials are smaller than powers

 

O(b^n) < O(c^n) where c > b > 1: Larger bases are larger
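Sampling a few values makes the ordering concrete. A minimal sketch (the exponent 3, degree 1, and bases 2 and 3 are arbitrary choices; note the inequalities only hold eventually):

import math

for n in (64, 256, 1024, 4096):
    print(n,
          math.log2(n) ** 3 < n,   # logs < polynomials (from n = 1024 here)
          n < 2 ** n,              # polynomials < powers
          2 ** n < 3 ** n)         # smaller base < larger base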

Combining Functions

Adding Functions

Recall that for a polynomial, we kept only the term with the highest degree. The same intuition applies here.

 

More formally: O(f(x) + g(x)) = max(O(f(x)), O(g(x)))

 

Example: Give an estimate for x^2 + log x
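One way to check the answer to this example numerically (a sketch; C = 2 and k = 1 are assumed witnesses):

import math

# x^2 + log x is O(x^2): log x <= x^2 once x > 1, so the sum is <= 2x^2.
C, k = 2, 1
assert all(x**2 + math.log2(x) <= C * x**2 for x in range(k + 1, 10000))
print("x^2 + log x = O(x^2) with witnesses C = 2, k = 1")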

Multiplying Functions

For multiplying functions, consider:

f_1(x) = O(g_1(x)), f_2(x) = O(g_2(x))

 

f_1(x) * f_2(x) ≤ (C_1 g_1(x)) * (C_2 g_2(x)) = (C_1 C_2)(g_1(x) * g_2(x)) once x exceeds both witnesses k_1 and k_2

 

So, f_1(x) * f_2(x) = O(g_1(x) * g_2(x))

 

Example: Give an estimate for 3x^2 * 7 log x
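And a numeric sketch for this product (C = 21 is an assumed witness; here the bound holds with equality):

import math

# 3x^2 * 7 log x = 21 (x^2 log x), so the product is O(x^2 log x).
C = 21
for x in (10, 100, 1000):
    product = (3 * x**2) * (7 * math.log2(x))
    print(x, product <= C * (x**2 * math.log2(x)))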

Other Notations

Big Omega Notation

Let f(x) and g(x) be functions defined on a subset of the real numbers. We say f(x) = Ω(g(x)) if there are positive real numbers C and k such that |f(x)| ≥ C|g(x)| for all x > k.

 

Ω(g(x)) is known as a "lower bound" of the function f(x).

Big Theta Notation

Let f(x) and g(x) be functions defined on a subset of the real numbers. If f(x) is O(g(x)) and f(x) is also Ω(g(x)), then f(x) is Θ(g(x)). We also say that f(x) and g(x) are of the same order.

 

Big Theta is known as a "tight" bound: g(x) bounds f(x) from both above and below.
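As a sketch of both bounds at once (witnesses C_1 = 3, C_2 = 4, k = 1, assumed for this example): f(x) = 3x^2 + 1 is Θ(x^2) because it can be squeezed between 3x^2 and 4x^2.

# f(x) = 3x^2 + 1 is Theta(x^2): bounded below by 3x^2 and above by 4x^2.
def f(x):
    return 3 * x**2 + 1

C1, C2, k = 3, 4, 1   # lower witness, upper witness, threshold
assert all(C1 * x**2 <= f(x) <= C2 * x**2 for x in range(k + 1, 10000))
print("3x^2 <= f(x) <= 4x^2 for x > 1, so f(x) is Theta(x^2)")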

Algorithms - Growth of Functions

By George Lee