Andrés Bedoya G.
Social Engineering Specialist / Social Analyzer / Web Developer / Hardcore JavaScript Developer / Python & Node.js enthusiast
WARNING
This section contains some math
Don't worry, we'll survive
Imagine we have multiple implementations of the same function.
How can we determine which one is the "best"?
Suppose we want to write a function that calculates the sum of all numbers from 1 up to (and including) some number n.
function addUpTo(n) {
let total = 0;
for (let i = 1; i <= n; i++) {
total += i;
}
return total;
}
function addUpTo(n) {
return n * (n + 1) / 2;
}
Which one is better?
    addUpTo(n) = 1 + 2 + 3 + ... + (n - 1) + n
+   addUpTo(n) = n + (n - 1) + (n - 2) + ... + 2 + 1
---------------------------------------------------
2 * addUpTo(n) = (n + 1) + (n + 1) + ... + (n + 1)   ← n copies of (n + 1)
2 * addUpTo(n) = n * (n + 1)
    addUpTo(n) = n * (n + 1) / 2
function addUpTo(n) {
let total = 0;
for (let i = 1; i <= n; i++) {
total += i;
}
return total;
}
let t1 = performance.now();
addUpTo(1000000000);
let t2 = performance.now();
console.log(`Time Elapsed: ${(t2 - t1) / 1000} seconds.`)
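Here's a sketch of how that timing comparison might look side by side. The helper names (addUpToLoop, addUpToFormula, timeIt) are made up for illustration, and it assumes a runtime where performance.now() is available globally (browsers, or Node 16+):

```javascript
// Rough side-by-side timing of the two versions. The exact numbers
// vary run to run and machine to machine — which is exactly the
// problem with relying on timers.
function addUpToLoop(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}

function addUpToFormula(n) {
  return n * (n + 1) / 2;
}

function timeIt(fn, n) {
  const t1 = performance.now();
  fn(n);
  const t2 = performance.now();
  return (t2 - t1) / 1000; // seconds
}

console.log(`Loop:    ${timeIt(addUpToLoop, 10000000)} seconds`);
console.log(`Formula: ${timeIt(addUpToFormula, 10000000)} seconds`);
```

Run it a few times: the formula version is consistently faster, but the loop's timing jitters from run to run.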
Timers...
Rather than counting seconds, which are so variable...
Let's count the number of simple operations the computer has to perform!
function addUpTo(n) {
return n * (n + 1) / 2;
}
3 simple operations, regardless of the size of n
1 multiplication
1 addition
1 division
function addUpTo(n) {
let total = 0;
for (let i = 1; i <= n; i++) {
total += i;
}
return total;
}
1 assignment (total = 0)
1 assignment (i = 1)
n comparisons (i <= n)
n additions and n assignments (total += i)
n additions and n assignments (i++)
Counting it all up, that's roughly 5n + 2 simple operations... but the exact tally depends on what we decide counts as an operation!
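One way to make the tally concrete is to instrument the loop. This sketch (a made-up countOperations helper, not part of the original code) picks one counting scheme — it also counts the final failed comparison that ends the loop, so it lands on 5n + 3; pick different rules and you get a different constant, which is exactly why exact counts are fuzzy:

```javascript
// An instrumented version of the addUpTo loop that tallies
// "simple operations" under one particular counting scheme.
function countOperations(n) {
  let ops = 0;
  let total = 0; ops++;        // 1 assignment
  let i = 1; ops++;            // 1 assignment
  while (true) {
    ops++;                     // 1 comparison (i <= n), including the final failed one
    if (!(i <= n)) break;
    total += i; ops += 2;      // 1 addition + 1 assignment
    i++; ops += 2;             // 1 addition + 1 assignment
  }
  return ops;                  // 5n + 3 under this scheme
}

console.log(countOperations(5));   // 28
console.log(countOperations(100)); // 503
```

The constant and the "+ 3" change with the counting rules; what never changes is that the count grows proportionally to n.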
Big O Notation is a way to formalize fuzzy counting
It allows us to talk formally about how the runtime of an algorithm grows as the inputs grow
We say that an algorithm is O(f(n)) if the number of simple operations the computer has to do is eventually less than a constant times f(n), as n increases
function addUpTo(n) {
return n * (n + 1) / 2;
}
Always 3 operations
O(1)
function addUpTo(n) {
let total = 0;
for (let i = 1; i <= n; i++) {
total += i;
}
return total;
}
Number of operations is (eventually) bounded by a multiple of n (say, 10n)
O(n)
O(2n) → O(n)
O(500) → O(1)
O(13n²) → O(n²)
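To see why constants get dropped, consider a function that makes two separate passes over its input — roughly 2n operations. The helper name (minAndMax) is invented for this example:

```javascript
// Two separate passes over the array: roughly 2n operations.
// We drop the constant, so this is O(n) — not "O(2n)".
function minAndMax(arr) {
  let min = Infinity;
  for (let i = 0; i < arr.length; i++) {   // pass 1: n operations
    if (arr[i] < min) min = arr[i];
  }
  let max = -Infinity;
  for (let i = 0; i < arr.length; i++) {   // pass 2: another n operations
    if (arr[i] > max) max = arr[i];
  }
  return { min, max };
}

console.log(minAndMax([3, 1, 4, 1, 5])); // { min: 1, max: 5 }
```

Doubling n doubles the work for both 2n and n — they grow at the same rate, so Big O treats them the same.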
function countUpAndDown(n) {
console.log("Going up!");
for (let i = 0; i < n; i++) {
console.log(i);
}
console.log("At the top!\nGoing down...");
for (let j = n - 1; j >= 0; j--) {
console.log(j);
}
console.log("Back down. Bye!");
}
O(n)
O(n)
Number of operations is (eventually) bounded by a multiple of n (say, 2n)
O(n)
function printAllPairs(n) {
for (var i = 0; i < n; i++) {
for (var j = 0; j < n; j++) {
console.log(i, j);
}
}
}
O(n)
O(n)
O(n) operation inside of an O(n) operation.
O(n * n)
O(n²)
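We can confirm the n * n growth by counting how many times the inner statement runs. This countPairs helper is a made-up stand-in for printAllPairs that tallies instead of logging:

```javascript
// For each of the n outer iterations, the inner loop runs n times,
// so the inner statement executes n * n times in total.
function countPairs(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      count++; // stands in for console.log(i, j)
    }
  }
  return count;
}

console.log(countPairs(10));  // 100
console.log(countPairs(100)); // 10000
```

Multiply n by 10 and the work grows by 100 — the signature of O(n²).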
Big O of Objects
- Insertion - O(1)
- Removal - O(1)
- Updating O(1)
- Searching - O(N)
- Access - O(1)
Big O of Object Methods
- Object.keys - O(N)
- Object.values - O(N)
- Object.entries - O(N)
- hasOwnProperty - O(1)
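The costs above can be seen on a plain object (the sample data here is made up):

```javascript
const person = { name: "Andrés", city: "Medellín" };

person.role = "developer";       // insertion: O(1)
person.city = "Bogotá";          // updating: O(1)
delete person.role;              // removal: O(1)

console.log("name" in person);   // access by key: O(1) → true
console.log(Object.values(person).includes("Bogotá")); // searching a value: O(N) → true
console.log(Object.keys(person));                      // O(N): visits every key
```

Key-based operations don't depend on how many properties the object has; anything that walks all keys or values scales with N.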
Big O of Arrays
- Insertion - It depends
- Removal - It depends
- Searching - O(N)
- Access - O(1)
Big O of Array Methods
- push - O(1)
- pop - O(1)
- shift - O(N)
- unshift - O(N)
- concat - O(N)
- slice - O(N)
- splice - O(N)
- sort - O(N * log N)
- forEach/map/filter/reduce/etc. - O(N)
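The gap between push/pop and shift/unshift comes down to re-indexing. Operations at the end of an array leave every other index alone; operations at the start force every remaining element to shift position:

```javascript
const nums = [1, 2, 3];

nums.push(4);    // O(1): append at the end, no other index changes
nums.pop();      // O(1): remove from the end
nums.unshift(0); // O(N): every element moves to a higher index
nums.shift();    // O(N): every element shifts back down

console.log(nums); // [ 1, 2, 3 ]
```

That's why, when the choice is yours, adding and removing at the end of an array is preferred.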
So far, we've been focusing on time complexity: how can we analyze the runtime of an algorithm as the size of the inputs increases?
We can also use big O notation to analyze space complexity: how much additional memory do we need to allocate in order to run the code in our algorithm?
function sum(arr) {
let total = 0;
for (let i = 0; i < arr.length; i++) {
total += arr[i];
}
return total;
}
one number (total)
another number (the loop counter i)
O(1) Space
function double(arr) {
let newArr = [];
for (let i = 0; i < arr.length; i++) {
newArr.push(2 * arr[i]);
}
return newArr;
}
n numbers (newArr holds one new number per input element)
O(n) Space
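The same contrast shows up with reversing an array. These two helpers (reverseInPlace and reversedCopy, invented for this example) produce the same result but differ in the extra memory they allocate:

```javascript
// O(1) extra space: only a few variables, no matter how long arr is.
function reverseInPlace(arr) {
  let left = 0;
  let right = arr.length - 1;
  while (left < right) {
    const tmp = arr[left];
    arr[left] = arr[right];
    arr[right] = tmp;
    left++;
    right--;
  }
  return arr;
}

// O(n) extra space: builds a new array that grows with the input.
function reversedCopy(arr) {
  const out = [];
  for (let i = arr.length - 1; i >= 0; i--) {
    out.push(arr[i]);
  }
  return out;
}

console.log(reverseInPlace([1, 2, 3])); // [ 3, 2, 1 ]
console.log(reversedCopy([1, 2, 3]));   // [ 3, 2, 1 ]
```

Note that space complexity here means additional memory: the input array itself is not counted.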