Understand the history, complexity, and best use cases for each sorting algorithm.
- Insertion Sort: builds the sorted array one element at a time (sketched below).
- Selection Sort: repeatedly finds the minimum and moves it to the sorted portion.
- Bubble Sort: repeatedly swaps adjacent out-of-order elements.
- Shell Sort: insertion sort run over diminishing gaps.
- Merge Sort: divide and conquer with merging.
- Heap Sort: sorts via a binary heap data structure.
- Quick Sort: partitions around a pivot element.
- Quick3 Sort: quicksort with 3-way partitioning, efficient on duplicate keys (sketched below).
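To make the one-line summaries concrete, here are minimal Python sketches of two of the listed algorithms: Insertion Sort and the 3-way ("Quick3") partitioning variant of Quick Sort. These are illustrative sketches, not tuned library implementations.

```python
def insertion_sort(a):
    """Builds the sorted prefix one element at a time (stable, in-place)."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's slot is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def quick3_sort(a, lo=0, hi=None):
    """Quick Sort with 3-way (Dijkstra) partitioning: strong on repeated keys."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[lo]
    lt, i, gt = lo, lo + 1, hi
    # Invariant: a[lo:lt] < pivot, a[lt:i] == pivot, a[gt+1:hi+1] > pivot.
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
        else:
            i += 1
    quick3_sort(a, lo, lt - 1)   # equal block a[lt:gt+1] is already in place
    quick3_sort(a, gt + 1, hi)
    return a
```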
| # | Algorithm | Average | Best | Worst | Space | Stable |
|---|---|---|---|---|---|---|
| 1 | Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| 2 | Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| 3 | Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| 4 | Quick3 Sort | O(n log n) | O(n) | O(n²) | O(log n) | No |
| 5 | Shell Sort | O(n^1.3) | O(n log n) | O(n²) | O(1) | No |
| 6 | Insertion Sort | O(n²) | O(n) | O(n²) | O(1) | Yes |
| 7 | Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| 8 | Bubble Sort | O(n²) | O(n) | O(n²) | O(1) | Yes |
Before diving into algorithms, it's essential to understand the key metrics used to evaluate and compare them.
This chart shows how different time complexities scale as the input size (n) increases. Notice how O(n²) grows much faster than O(n log n).
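If you want to regenerate a chart like this yourself, a minimal sketch with matplotlib (assumed available) is enough:

```python
import math
import matplotlib.pyplot as plt

ns = range(1, 101)
plt.plot(ns, [math.log2(n) for n in ns], label="O(log n)")
plt.plot(ns, list(ns), label="O(n)")
plt.plot(ns, [n * math.log2(n) for n in ns], label="O(n log n)")
plt.plot(ns, [n * n for n in ns], label="O(n²)")
plt.xlabel("input size n")
plt.ylabel("operations")
plt.legend()
plt.show()
```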
Time complexity describes how the runtime of an algorithm grows as the input size increases. It's expressed using Big O notation.
- Best case: the minimum time the algorithm takes (ideal input).
- Average case: the expected time for random input.
- Worst case: the maximum time the algorithm can take.
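A quick way to see best vs. worst case in practice is to time the same algorithm on already-sorted and reverse-sorted input. With the `insertion_sort` sketch from above, the sorted input finishes in roughly linear time while the reversed input is quadratic:

```python
import time

def timed(sort_fn, data):
    start = time.perf_counter()
    sort_fn(list(data))  # copy so each run gets fresh input
    return time.perf_counter() - start

n = 2000
best = timed(insertion_sort, range(n))           # already sorted: ~O(n)
worst = timed(insertion_sort, range(n, 0, -1))   # reversed: ~O(n²)
print(f"best {best:.4f}s, worst {worst:.4f}s")
```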
Space complexity measures the additional memory an algorithm needs relative to the input size.
- O(1): constant space; uses fixed memory regardless of input.
- O(log n): logarithmic; typically from the recursion stack.
- O(n): linear; memory grows with the input size.
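The difference shows up directly in code: an in-place swap loop needs O(1) extra memory, while merging two sorted runs allocates a buffer proportional to the input. A simplified sketch:

```python
def reverse_in_place(a):
    """O(1) extra space: only a couple of index variables."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]
        i += 1
        j -= 1

def merge(left, right):
    """O(n) extra space: the merged output buffer."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```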
A stable sorting algorithm maintains the relative order of elements with equal keys.
Example (sorting by the numeric key):

```
Input:    [(A,2), (B,1), (C,2), (D,1)]
Stable:   [(B,1), (D,1), (A,2), (C,2)]
Unstable: [(D,1), (B,1), (C,2), (A,2)]
```
Stability matters when sorting by multiple criteria or preserving previous orderings.
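Python's built-in `sorted` is guaranteed stable (it uses Timsort), so it reproduces the "Stable" line from the example above:

```python
pairs = [("A", 2), ("B", 1), ("C", 2), ("D", 1)]
print(sorted(pairs, key=lambda p: p[1]))
# [('B', 1), ('D', 1), ('A', 2), ('C', 2)]  -- equal keys keep input order
```

Sorting by a secondary key first and then by the primary key relies on exactly this guarantee.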
Big O describes the upper bound of an algorithm's growth rate, focusing on the dominant term.
| Notation | Name | Ops for n=1000 |
|---|---|---|
| O(1) | Constant | 1 op |
| O(log n) | Logarithmic | ~10 ops |
| O(n) | Linear | 1,000 ops |
| O(n log n) | Linearithmic | ~10,000 ops |
| O(n²) | Quadratic | 1,000,000 ops |
The table shows how many operations each complexity requires to process 1,000 items. Notice how O(n²) needs 1 million operations while O(n) only needs 1,000.
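You can verify the table's rounded figures directly; for n = 1000, log₂(n) ≈ 10 and n·log₂(n) ≈ 10,000:

```python
import math

n = 1000
print(math.log2(n))      # ≈ 9.97    -> "~10 ops"
print(n * math.log2(n))  # ≈ 9966    -> "~10,000 ops"
print(n * n)             # 1000000   -> "1,000,000 ops"
```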
- Small arrays: simple algorithms like Insertion Sort often outperform complex ones thanks to low overhead.
- Nearly sorted data: Insertion Sort and Bubble Sort approach O(n) performance.
- Large arrays: use O(n log n) algorithms such as Merge Sort, Heap Sort, or Quick Sort for the best performance.
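These guidelines are often combined in practice: real-world hybrids such as introsort and Timsort cut over to Insertion Sort below a small threshold. A simplified sketch of that pattern, reusing the `merge` helper from the space-complexity sketch above (the threshold of 32 is an illustrative assumption, not a tuned value):

```python
SMALL = 32  # hypothetical cutoff; real libraries tune this empirically

def hybrid_sort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= SMALL:
        # Small range: insertion sort's low overhead wins.
        for i in range(lo + 1, hi + 1):
            key, j = a[i], i - 1
            while j >= lo and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a
    # Large range: merge sort's O(n log n) guarantee takes over.
    mid = (lo + hi) // 2
    hybrid_sort(a, lo, mid)
    hybrid_sort(a, mid + 1, hi)
    a[lo:hi + 1] = merge(a[lo:mid + 1], a[mid + 1:hi + 1])
    return a
```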