Algorithms & Complexity
Big O Visualiser
n is the input size, meaning the number of items being processed (e.g. elements in a list to sort, records to search). Big O notation describes how the number of operations grows as n increases. Toggle curves, drag Max n to extend the x-axis, and hover the chart to compare exact operation counts at any input size.
Operations at n = 20
Real-world time at n = 20
Assumes a modern CPU running 10⁹ (one billion) simple operations per second. These are order-of-magnitude estimates, not exact timings.
How much slower? (relative to fastest active curve at n = 20)
Shows how many times more operations each complexity requires compared to the fastest active curve at the current n.
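As a rough sketch of how such a comparison can be computed (this is illustrative Python, not the visualiser's actual code), each curve's operation count at a given n can be divided by the fastest active curve's count:

```python
import math

# Approximate operation counts for each complexity class at input size n.
COMPLEXITIES = {
    "O(1)": lambda n: 1,
    "O(log n)": lambda n: math.log2(n),
    "O(n)": lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)": lambda n: n ** 2,
    "O(2^n)": lambda n: 2 ** n,
}

def slowdown_table(n):
    """Return each curve's operation count and its ratio to the fastest curve."""
    counts = {name: f(n) for name, f in COMPLEXITIES.items()}
    fastest = min(counts.values())
    return {name: (ops, ops / fastest) for name, ops in counts.items()}

# At n = 20, O(2^n) needs 2**20 = 1,048,576 operations:
# about a millisecond at 10^9 simple operations per second.
```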
Exam Tips: Big O Notation
- Big O describes an upper bound on how the number of operations grows, not the exact count. It is conventionally used to represent the worst-case growth rate, so you will often see the terms used interchangeably at GCSE and A-Level.
- Constants are dropped: O(3n) is written as O(n). O(n² + n) is written as O(n²). Only the dominant term matters.
- Binary Search is O(log n) because each step halves the remaining data. This is why it is dramatically faster than Linear Search O(n) for large datasets.
- Nested loops usually mean O(n²): a loop inside a loop where both depend on n gives n × n operations.
- O(2ⁿ) is considered impractical for large n. Each additional element added to the input doubles the number of operations. Turn it on in the chart and compare it to O(n²) to see how quickly it becomes unworkable.
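The O(2ⁿ) vs O(n²) comparison suggested above can also be checked numerically. A small sketch (illustrative only) that finds where the exponential curve permanently overtakes the quadratic one:

```python
def crossover(limit=100):
    """Smallest n after which 2**n stays above n**2 for every larger n up to limit.

    n**2 still matches or beats 2**n at n = 2, 3 and 4, but from n = 5
    onwards the exponential curve pulls away for good.
    """
    last = max(n for n in range(1, limit + 1) if n ** 2 >= 2 ** n)
    return last + 1
```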
Which Algorithms Have Which Complexity?
O(1) : Constant
The same number of operations regardless of input size. The ideal, but rare.
Array index access
Hash table lookup
Stack push/pop
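A minimal Python sketch of the three examples above (using a list as a stack and a dict as a hash table):

```python
# Each of these operations takes (amortised) constant time,
# regardless of how many items the container holds.
stack = []
stack.append(42)      # push: O(1)
top = stack.pop()     # pop:  O(1)

table = {"alice": 1, "bob": 2}
value = table["bob"]  # hash table lookup: O(1) on average

items = [10, 20, 30]
x = items[1]          # array index access: O(1)
```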
O(log n) : Logarithmic
Operations grow very slowly. Doubling n adds roughly one extra step.
Binary Search
BST operations (balanced)
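A sketch of Binary Search in Python, showing where the halving happens:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the remaining search range, so the number
    of steps grows as O(log n).
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1
```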
O(n) : Linear
Each element is visited once. The work scales proportionally with input size.
Linear Search
Traversing a list
Finding max/min
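Two of the examples above sketched in Python; both touch every element once, so the work grows in direct proportion to n:

```python
def linear_search(items, target):
    """Check each element in turn: O(n) comparisons in the worst case."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

def find_max(items):
    """Finding the maximum also visits every element exactly once: O(n)."""
    best = items[0]
    for value in items[1:]:
        if value > best:
            best = value
    return best
```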
O(n log n) : Linearithmic
Slightly worse than linear but much better than quadratic. The best achievable for comparison-based sorting.
Merge Sort
Quicksort (average)
Shell Sort (best gap sequence)
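A compact Merge Sort sketch in Python. The list is halved log n times, and each level of halving does O(n) merging work, giving O(n log n) overall:

```python
def merge_sort(items):
    """Sort a list in O(n log n) by recursively halving and merging."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves: O(n) work at each level of recursion.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```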
O(n²) : Quadratic
Doubling n quadruples the work. Common in simple sorting algorithms with nested loops.
Bubble Sort
Selection Sort
Insertion Sort (worst)
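A Bubble Sort sketch with a comparison counter, making the nested-loop n × n behaviour visible (the exact count is n(n−1)/2, which is O(n²)):

```python
def bubble_sort(items):
    """O(n^2) sort via nested loops; returns (sorted copy, comparison count)."""
    a = list(items)
    comparisons = 0
    for i in range(len(a) - 1):
        # Each pass bubbles the next-largest value to the end.
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons
```

Doubling the input size roughly quadruples the comparison count, which is exactly the quadratic behaviour described above.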
O(2ⁿ) : Exponential
Each additional input element doubles the work. Impractical beyond very small n.
Naive recursive Fibonacci
Brute-force subsets
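A sketch of the naive recursive Fibonacci, instrumented with a call counter so the roughly-doubling call tree can be observed directly:

```python
def fib(n, calls=None):
    """Naive recursive Fibonacci: each call spawns two more, so the
    total number of calls grows roughly as O(2^n)."""
    if calls is not None:
        calls[0] += 1
    if n < 2:
        return n
    return fib(n - 1, calls) + fib(n - 2, calls)

# Memoising or iterating reduces this to O(n) - the exponential cost
# comes purely from recomputing the same subproblems over and over.
```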
Complexity Quick Reference