Learn Sorting Algorithms

Understand the history, complexity, and best use cases for each sorting algorithm.

Comparative Chart

#  Algorithm       Average     Best        Worst       Space     Stable
1  Heap Sort       O(n log n)  O(n log n)  O(n log n)  O(1)      No
2  Merge Sort      O(n log n)  O(n log n)  O(n log n)  O(n)      Yes
3  Quick Sort      O(n log n)  O(n log n)  O(n²)       O(log n)  No
4  Quick3 Sort     O(n log n)  O(n)        O(n²)       O(log n)  No
5  Shell Sort      O(n^1.3)    O(n log n)  O(n²)       O(1)      No
6  Insertion Sort  O(n²)       O(n)        O(n²)       O(1)      Yes
7  Selection Sort  O(n²)       O(n²)       O(n²)       O(1)      No
8  Bubble Sort     O(n²)       O(n)        O(n²)       O(1)      Yes

Understanding the Concepts

Before diving into algorithms, it's essential to understand the key metrics used to evaluate and compare them.

Complexity Growth Visualization

This chart shows how different time complexities scale as the input size (n) increases. Notice how O(n²) grows much faster than O(n log n).

[Chart: number of operations vs. input size (n), comparing O(1) constant, O(log n) logarithmic, O(n) linear, O(n log n) linearithmic, and O(n²) quadratic growth]

Time Complexity

Time complexity describes how the runtime of an algorithm grows as the input size increases. It's expressed using Big O notation.

Best

The minimum time the algorithm takes (ideal input)

Average

Expected time for random input

Worst

Maximum time the algorithm can take
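These three cases can be made concrete with a small sketch. Below is an Insertion Sort with an illustrative comparison counter (the counter is an assumption added for this example, not part of the algorithm itself): on already-sorted input it does n−1 comparisons (best case, O(n)), while on reversed input it does n(n−1)/2 (worst case, O(n²)).

```python
def insertion_sort_counted(items):
    """Insertion sort that also counts key comparisons (for illustration)."""
    a = list(items)          # work on a copy
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

best, c_best = insertion_sort_counted(list(range(10)))           # sorted input
worst, c_worst = insertion_sort_counted(list(range(9, -1, -1)))  # reversed input
# c_best == 9 (n-1 comparisons), c_worst == 45 (n(n-1)/2 comparisons)
```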

Space Complexity

Space complexity measures the additional memory an algorithm needs relative to the input size.

O(1)

Constant space - uses fixed memory regardless of input

O(log n)

Logarithmic - typically from the recursion stack

O(n)

Linear - memory grows with input size
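As a quick illustration of O(1) auxiliary space, here is an in-place Selection Sort sketch: it mutates the input list directly and allocates only a few index variables, no auxiliary array.

```python
def selection_sort_inplace(a):
    """Sorts a in place: O(1) extra space (just a few index variables)."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:   # find the smallest remaining element
                m = j
        a[i], a[m] = a[m], a[i]   # swap it into position
    return a

data = [3, 1, 2]
result = selection_sort_inplace(data)
# result is data -> True: the same list object, no copy was allocated
```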

Stability

A stable sorting algorithm maintains the relative order of elements with equal keys.

Example:

Input: [(A,2), (B,1), (C,2), (D,1)]

Stable: [(B,1), (D,1), (A,2), (C,2)]

Unstable: [(D,1), (B,1), (C,2), (A,2)]

Stability matters when sorting by multiple criteria or preserving previous orderings.
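Python's built-in sorted is guaranteed stable, so the example above can be reproduced directly: sorting by the numeric key alone, ties keep their original relative order.

```python
pairs = [("A", 2), ("B", 1), ("C", 2), ("D", 1)]
# Sort by the numeric key only; equal keys keep their input order (stable).
by_key = sorted(pairs, key=lambda p: p[1])
# by_key == [("B", 1), ("D", 1), ("A", 2), ("C", 2)]
```

This is exactly why stability helps with multi-criteria sorting: a second stable sort on another key preserves the ordering produced by the first.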

Big O Notation

Big O describes the upper bound of an algorithm's growth rate, focusing on the dominant term.

Notation    Name          Ops for n = 1,000
O(1)        Constant      1 op
O(log n)    Logarithmic   ~10 ops
O(n)        Linear        1,000 ops
O(n log n)  Linearithmic  ~10,000 ops
O(n²)       Quadratic     1,000,000 ops

The table shows how many operations each complexity requires to process 1,000 items. Notice how O(n²) needs 1 million operations while O(n) only needs 1,000.
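The rounded figures in the table can be checked with a few lines of Python:

```python
import math

n = 1000
ops = {
    "O(1)":       1,
    "O(log n)":   math.log2(n),      # about 9.97, rounds to ~10
    "O(n)":       n,
    "O(n log n)": n * math.log2(n),  # about 9,966, rounds to ~10,000
    "O(n²)":      n * n,             # 1,000,000
}
for name, count in ops.items():
    print(f"{name:12s} ~{round(count):,} ops")
```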

Practical Tips for Choosing an Algorithm

Small Data (n < 50)

Simple algorithms like Insertion Sort often outperform complex ones due to low overhead.

Nearly Sorted Data

Insertion Sort and Bubble Sort approach O(n) performance on nearly sorted arrays.
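For Bubble Sort, the near-O(n) behaviour comes from an early-exit check: if a full pass makes no swaps, the array is already sorted and the loop stops. A sketch with an illustrative pass counter (the counter is an assumption added for this example):

```python
def bubble_sort_counted(items):
    """Bubble sort with early exit; returns (sorted list, passes made)."""
    a = list(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:   # no swaps: the array is sorted, stop early
            return a, passes

# Already-sorted input finishes in a single O(n) pass:
sorted_out, p1 = bubble_sort_counted([1, 2, 3, 4, 5])    # p1 == 1
# One out-of-place pair needs only one extra pass:
nearly_out, p2 = bubble_sort_counted([1, 2, 4, 3, 5])    # p2 == 2
```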

Large Data

Use O(n log n) algorithms like Merge Sort, Heap Sort, or Quick Sort for best performance.
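Merge Sort is a representative O(n log n) choice. A compact top-down sketch (the merge step allocates the O(n) auxiliary list noted in the chart, and using <= on ties keeps the sort stable):

```python
def merge(left, right):
    """Merge two sorted lists into a new sorted list (O(n) extra space)."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in order: stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])          # append whichever side has leftovers
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Top-down merge sort: split in half, sort each half, merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

For example, merge_sort([5, 2, 4, 1, 3]) returns [1, 2, 3, 4, 5] while leaving the input list untouched.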