If you've ever wondered why one algorithm is "faster" than another, Big-O notation is the answer. It's one of the most important concepts in computer science — and once you understand it, you'll never look at a for loop the same way again.
What is Big-O Notation?
Big-O notation describes how the runtime or memory of an algorithm scales as the input size n grows. It answers the question:
"If I double the input, how much slower does my algorithm get?"
It intentionally ignores constants and lower-order terms — because we care about the shape of the growth, not exact numbers.
Key rule: by convention, Big-O analysis usually describes the worst-case scenario.
The Common Complexity Classes
O(1) — Constant Time
The algorithm takes the same amount of time regardless of input size.
Example: Accessing an element in an array by index.
const first = arr[0]; // Always one operation
Real life: Opening a book straight to a bookmarked page — it takes the same time no matter how thick the book is.
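Hash-map lookups are another common O(1) operation. A minimal sketch (the `getAge` helper and the sample data are illustrative, not from any library):

```javascript
// O(1): a Map lookup takes (on average) the same time
// whether the map holds 10 entries or 10 million.
const ages = new Map([["ada", 36], ["alan", 41]]);

function getAge(name) {
  return ages.get(name); // one hash + one lookup, independent of map size
}
```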
O(log n) — Logarithmic Time
The algorithm halves the problem with each step. Extremely efficient.
Example: Binary Search — find a number in a sorted array.
function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return mid;      // found it
    else if (arr[mid] < target) lo = mid + 1; // search the right half
    else hi = mid - 1;                        // search the left half
  }
  return -1; // not found
}
If n = 1,000,000, binary search only takes ~20 steps. That's the power of O(log n).
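One way to verify the ~20-step claim is to instrument the search with a counter. This sketch (the `countedBinarySearch` name and return shape are my own, for illustration) counts probes on a million-element array:

```javascript
// Count how many probes binary search makes on a million-element array.
function countedBinarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1, steps = 0;
  while (lo <= hi) {
    steps++; // one probe per loop iteration
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return { index: mid, steps };
    else if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return { index: -1, steps };
}

const million = Array.from({ length: 1_000_000 }, (_, i) => i);
const { index, steps } = countedBinarySearch(million, 999_999);
// steps stays at or below ~20, since log2(1,000,000) ≈ 19.9
```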
O(n) — Linear Time
Runtime grows directly with input size. One pass through all elements.
Example: Finding the maximum value in an unsorted array.
function findMax(arr) {
  let max = arr[0];
  for (const item of arr) { // n iterations
    if (item > max) max = item;
  }
  return max;
}
O(n log n) — Linearithmic Time
The sweet spot for efficient sorting algorithms.
Examples: Merge Sort, Quick Sort, Heap Sort.
Most practical sorting problems are solved in O(n log n). If you see JavaScript's built-in Array.prototype.sort(), it's typically O(n log n).
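As a sketch of where the n log n comes from: merge sort splits the array in half log n times, and each level of merging costs O(n). A minimal implementation (not tuned for production use):

```javascript
// Merge sort: O(n log n) time, O(n) extra space.
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));  // log n levels of splitting...
  const right = mergeSort(arr.slice(mid));
  // ...and an O(n) merge of two sorted halves at each level.
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    out.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return out.concat(left.slice(i), right.slice(j));
}
```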
O(n²) — Quadratic Time
A nested loop over all pairs. Gets slow fast.
Example: Bubble Sort
function bubbleSort(arr) {
  for (let i = 0; i < arr.length; i++) {           // n iterations
    for (let j = 0; j < arr.length - i - 1; j++) { // up to n iterations
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]]; // swap adjacent pair
      }
    }
  }
  return arr;
}
With n = 1,000, that's 1,000,000 operations. Compare to Merge Sort's ~10,000.
O(2ⁿ) — Exponential Time
Doubles with every additional input. Only feasible for tiny inputs.
Example: Naïve recursive Fibonacci.
function fib(n) {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2); // recomputes the same values repeatedly!
}
fib(50) makes roughly 40 billion recursive calls. Always avoid this pattern — use memoization or dynamic programming instead.
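A memoized version drops from O(2ⁿ) to O(n) by caching each result so it's computed only once. A sketch (the `fibMemo` name is illustrative):

```javascript
// Memoized Fibonacci: each value computed once → O(n) time, O(n) space.
function fibMemo(n, memo = new Map()) {
  if (n <= 1) return n;
  if (memo.has(n)) return memo.get(n); // reuse the cached result
  const result = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  memo.set(n, result);
  return result;
}

fibMemo(50); // → 12586269025, returned almost instantly
```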
Quick Reference Chart
| Complexity | Name | n = 10 | n = 100 | n = 1,000 |
|---|---|---|---|---|
| O(1) | Constant | 1 | 1 | 1 |
| O(log n) | Logarithmic | 3 | 7 | 10 |
| O(n) | Linear | 10 | 100 | 1,000 |
| O(n log n) | Linearithmic | 33 | 664 | 9,966 |
| O(n²) | Quadratic | 100 | 10,000 | 1,000,000 |
| O(2ⁿ) | Exponential | 1,024 | ~10³⁰ | ~10³⁰¹ |
Space Complexity
Big-O also describes memory usage. A function that creates a new array of size n uses O(n) space. A recursive function that calls itself n times uses O(n) stack space.
Tip: Sometimes you can trade time for space (and vice versa). Memoization uses O(n) space to turn O(2ⁿ) time into O(n) time.
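To make the stack-space point concrete, here's a sketch of two functions that compute the same sum, where one uses O(n) call-stack frames and the other O(1) space (both function names are illustrative):

```javascript
// O(n) space: each recursive call adds a stack frame until n reaches 0.
function sumRecursive(n) {
  if (n === 0) return 0;
  return n + sumRecursive(n - 1);
}

// O(1) space: a single accumulator variable, no growing stack.
function sumIterative(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}
```

Both return the same value, but for large n the recursive version risks a stack overflow while the iterative one does not.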
Tips for Success
Always find the dominant term — O(n² + n) simplifies to O(n²).
Drop constants — O(3n) is just O(n). Big-O is about growth shape, not exact speed.
Watch for hidden loops — String concatenation in a loop can be O(n²) in some languages.
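The string-concatenation pitfall can be sketched like this. (Modern JavaScript engines often optimize repeated `+=`, so the O(n²) cost mainly bites in other languages, but the pattern is the same.)

```javascript
// Potentially O(n²): each += may copy everything built so far,
// so n appends can cost 1 + 2 + ... + n ≈ n²/2 character copies.
function joinNaive(words) {
  let s = "";
  for (const w of words) s += w;
  return s;
}

// O(n): collect the pieces and join once at the end.
function joinLinear(words) {
  return words.join("");
}
```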
Know your data structures — Hash maps give O(1) average lookup; trees give O(log n). Choosing the right structure often matters more than optimizing your algorithm.
Practice Problems
Try to determine the Big-O complexity of these before reading on:
- Finding if an array contains a duplicate element (brute force)
- Checking if a number is prime
- Generating all subsets of a set
- Traversing every node in a binary tree
Answers: O(n²), O(√n), O(2ⁿ), O(n)
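As a sketch of the first problem, the brute force compares every pair (O(n²)), while a Set brings it down to O(n) on average — an example of the "know your data structures" tip above (both function names are illustrative):

```javascript
// O(n²): compare every pair of elements.
function hasDuplicateBrute(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}

// O(n) average: a Set gives O(1) average-time membership checks.
function hasDuplicateFast(arr) {
  const seen = new Set();
  for (const x of arr) {
    if (seen.has(x)) return true;
    seen.add(x);
  }
  return false;
}
```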
Use our Big-O Complexity Cheatsheet to keep this reference handy, and check the Time/Space Complexity Calculator to analyze algorithms interactively.