Algorithmic Analysis

CODE 101

What are Algorithms?

An algorithm is a way of solving a given problem.

An algorithm tells us, in a step-by-step procedure, which operations to perform and when to perform them so as to solve the given problem.

 

Algorithm to check primality of a number
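As an illustration (a minimal sketch in Python; the original slide's code is not reproduced here, and the function name is an assumption), a trial-division primality check:

    def is_prime(n):
        # Numbers below 2 are not prime by definition.
        if n < 2:
            return False
        # Trial division: test divisors up to sqrt(n), since any
        # factor larger than sqrt(n) pairs with one smaller than it.
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    print(is_prime(17))  # True
    print(is_prime(18))  # False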

In computer programming, an algorithm is a set of statements that execute in a certain order so as to produce/compute an output (which might be based on an input).

Slow vs Fast algorithms

 

A computer executes the statements of a program. It doesn’t know what it is doing.

It is the programmer's job to make sure the algorithm being executed is a good (fast) and elegant (compact) one, so as to make optimal use of the computer's processing power and memory.

 

A bad (slow) algorithm may run on the best hardware available and still be slower than a good (fast) algorithm running on an old computer system.

Examples of slow algorithms:

  • Naive matrix multiplication (CPU-based, O(n³))

  • Brute-force cryptographic key search

  • Determinant calculation using Laplace expansion (O(n!))

  • Factoring large integers (products of large primes)

 

Examples of fast algorithms:

  • Hash tables

  • Sorting algorithms such as quick sort, merge sort, heap sort

  • Binary search

  • Linear search (for small inputs)
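To make the contrast concrete, here is a minimal Python sketch (illustrative only; the variable names and sizes are assumptions) comparing a linear scan over a list with a hash-based lookup in a set. Both answer the same membership question, but the hash table does far less work as N grows:

    import time

    n = 1_000_000
    data_list = list(range(n))  # membership test scans: O(n)
    data_set = set(range(n))    # membership test hashes: O(1) on average

    target = n - 1  # worst case for the linear scan

    start = time.perf_counter()
    found = target in data_list
    print("list:", time.perf_counter() - start)

    start = time.perf_counter()
    found = target in data_set
    print("set: ", time.perf_counter() - start)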

Complexity Analysis

The complexity of an algorithm gives us an idea of how the algorithm will perform on a computer system.

The execution time of an algorithm depends on:

  • Processor performance

  • Memory (RAM)

  • Inputs

  • Disk speed (I/O)

  • Network speed (if required)

  • Power consumption (for mobile devices)

These factors affect how the algorithm might perform in the real world.

For theoretical purposes, however, knowing how the execution time of an algorithm grows with the size of its input (N) is a good enough measure of its expected performance.

 

The two main metrics of theoretical performance are:

  • Time Complexity

  • Space Complexity

Order of growth: Big-O notation

Big-O notation is a mathematical notation that describes the limiting behaviour of a function.

 

In the context of algorithmic analysis, Big-O notation gives us an idea of the asymptotic upper bound (<=) on the execution time or the space requirement of an algorithm as its input size (N) grows larger.
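Formally (the standard definition, stated in LaTeX for reference; f is the measured cost and g the bounding function):

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0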

Some other notations:

  • Big-Ω (Omega): asymptotic lower bound (>=)

  • Big-Θ (Theta): asymptotically tight bound (both upper and lower)

These notations are used to classify algorithms according to their performance. Generally, Big-O notation is used.

Time complexity and Space complexity

Time complexity is a function describing the amount of time an algorithm takes in terms of the amount of input to the algorithm. "Time" can mean

 

  • the number of memory accesses performed

  • the number of comparisons between integers

  • the number of times some inner loop is executed

Space complexity is a function describing the amount of memory (space) an algorithm takes in terms of the amount of input to the algorithm. We often speak of "extra" memory needed, not counting the memory needed to store the input itself.

For both time and space, we are interested in the asymptotic complexity of the algorithm: When n (the number of items of input) goes to infinity, what happens to the performance of the algorithm?
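For example (a minimal sketch in Python; the function names are illustrative, not from the original slides), two ways to reverse a list that take the same O(n) time but different amounts of extra memory:

    def reversed_copy(items):
        # Builds a brand-new list: O(n) extra space.
        return [items[i] for i in range(len(items) - 1, -1, -1)]

    def reverse_in_place(items):
        # Swaps elements pairwise: O(1) extra space.
        i, j = 0, len(items) - 1
        while i < j:
            items[i], items[j] = items[j], items[i]
            i, j = i + 1, j - 1
        return items

    print(reversed_copy([1, 2, 3, 4]))     # [4, 3, 2, 1]
    print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]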


When designing an efficient algorithm, there is almost always a space-time trade-off.

A task can often be handled either by a fast algorithm that uses quite a lot of working memory, or by a slower algorithm that uses very little working memory.
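A classic illustration (a minimal Python sketch; the names and the choice of memoization are assumptions, not from the original slides): the memoized Fibonacci spends O(n) extra memory to avoid the exponential running time of the naive version.

    from functools import lru_cache

    def fib_slow(n):
        # Recomputes the same subproblems repeatedly:
        # exponential time, only O(n) call-stack space.
        if n < 2:
            return n
        return fib_slow(n - 1) + fib_slow(n - 2)

    @lru_cache(maxsize=None)
    def fib_fast(n):
        # Caches every result: O(n) time, O(n) extra memory.
        if n < 2:
            return n
        return fib_fast(n - 1) + fib_fast(n - 2)

    print(fib_fast(35))  # 9227465, returns instantly
    print(fib_slow(35))  # same answer, noticeably slower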

Computing the complexity of an algorithm

We will express running time as a function of the input size, T(n), to determine the exact time (or an upper bound on the time) it takes for an algorithm to run, that is, to produce an output for the given input.
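For instance (an illustrative sketch, not taken from the original slides), counting the constant-time statements in a simple loop gives T(n) directly:

    def sum_of_array(arr):
        total = 0        # 1 operation
        for x in arr:    # the loop body runs n times
            total += x   # 1 operation per iteration
        return total     # 1 operation

    # T(n) = n + 2, so the running time grows linearly: O(n)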

 

 

A few algorithms that we shall analyse:

  1. Linear Search
  2. Binary Search
  3. Bubble Sort
  4. Fibonacci Sequence

Iterative algorithms:

 

  • for loops: repeat statements a finite number of times; the number of iterations can be counted directly (see the sketch after this list).

  • while loops: repeat statements until a condition is met; counting the number of iterations is a little more difficult.

  • assignments, comparisons, and fetching an array element by index can be assumed to have a constant time cost.
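As a sketch (in Python; linear search is one of the algorithms listed above, though this particular code is illustrative), counting the iterations of a for loop gives the worst-case running time:

    def linear_search(arr, target):
        # Worst case: target is absent or in the last position,
        # so the loop body runs n times.
        for i in range(len(arr)):
            if arr[i] == target:  # constant-time comparison
                return i
        return -1

    # Worst case: T(n) = c*n + c'  =>  O(n)
    print(linear_search([4, 8, 15, 16, 23, 42], 23))  # 4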

Recursive algorithms:

 

Each recursive call contributes to the execution time. The running time T(n) is expressed as a recurrence, which can be solved by repeated substitution, a recursion tree, or the Master theorem (see the sketch below).

Other statements can be assumed to have constant time cost.
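For example (a minimal Python sketch; binary search is one of the algorithms listed above, but this particular code is illustrative):

    def binary_search(arr, target, lo=0, hi=None):
        if hi is None:
            hi = len(arr) - 1
        if lo > hi:              # empty range: not found
            return -1
        mid = (lo + hi) // 2     # constant work per call
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:  # recurse on the right half
            return binary_search(arr, target, mid + 1, hi)
        else:                    # recurse on the left half
            return binary_search(arr, target, lo, mid - 1)

    # Each call does constant work and recurses on half the range:
    # T(n) = T(n/2) + c  =>  O(log n)  (e.g. by the Master theorem)
    print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4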

 

To find the Big-O notation:

 

  1. From T(n), keep only the fastest-growing term.

  2. Drop its coefficient (a worked example follows).
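Worked example (a routine illustration, not from the original slides): if T(n) = 3n² + 5n + 2, the fastest-growing term is 3n²; dropping the coefficient gives O(n²).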

Algorithmic Analysis

By Animesh Ghosh