Advanced Recursion — Part 2: Recursion Tree and Complexity Analysis
Introduction
Recursion is easy to write — but rarely easy to analyze.
Every recursive call branches into smaller subproblems, forming an invisible execution tree behind the scenes. For professionals, visualizing that tree and quantifying its growth is crucial to understanding time and space complexity.
This post uncovers how to construct and interpret recursion trees, estimate the total cost of recursive calls, and relate them to the Master Theorem — the mathematical backbone of recursion analysis.
1. The Idea Behind the Recursion Tree
A recursion tree represents how a recursive algorithm expands into multiple subcalls.
Each node denotes a function call, and its children represent the calls it spawns.
Example: The Classic T(n) = T(n/2) + O(n)
Consider an algorithm that does O(n) work per call and then recurses once on half the input (n/2) — roughly the average-case recurrence of Quickselect. (Note this is not Merge Sort, which makes two recursive calls, nor Binary Search, which does only O(1) work per call.)
The recursion tree for T(n) would look like this:
Level 0: n
Level 1: n/2
Level 2: n/4
...
Level k: 1
At each level, the work is n, n/2, n/4, … — a geometric series that sums to at most 2n, so T(n) = O(n).
Summing per-level costs this way is exactly how a recursion tree turns a recurrence into a closed-form bound.
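To make the geometric-series intuition concrete, here is a minimal Python sketch (the function name `level_costs` is illustrative) that lists the per-level work for T(n) = T(n/2) + n and sums it:

```python
def level_costs(n):
    """Per-level work in the recursion tree of T(n) = T(n/2) + n.

    Each level does work equal to the current subproblem size,
    and the size halves at every level.
    """
    costs = []
    while n >= 1:
        costs.append(n)
        n //= 2
    return costs

print(level_costs(16))       # [16, 8, 4, 2, 1]
print(sum(level_costs(16)))  # 31 — just under 2n, i.e. O(n)
```

The sum never exceeds 2n no matter how large n grows, which is why the tree's total cost is linear even though it has log₂ n + 1 levels.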
2. Steps to Build a Recursion Tree
1. Write the recurrence relation.
   Express the algorithm as T(n) = a·T(n/b) + f(n), where:
   - a = number of recursive calls
   - n/b = size of each subproblem
   - f(n) = cost of dividing and combining results
2. Draw the levels of calls.
   - Level 0 → 1 call of size n
   - Level 1 → a calls of size n/b
   - Level 2 → a² calls of size n/b², and so on
3. Compute the cost per level.
   Level i contributes aⁱ · f(n/bⁱ) work.
4. Sum the total cost.
   Add the costs across all levels until the subproblem size reaches 1:
   Total cost = Σᵢ aⁱ · f(n/bⁱ)
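The steps above can be carried out mechanically. The helper below (names are illustrative) expands T(n) = a·T(n/b) + f(n) level by level and sums aⁱ·f(n/bⁱ), stopping once subproblems reach size 1:

```python
def tree_cost(n, a, b, f):
    """Total recursion-tree cost for T(n) = a*T(n/b) + f(n),
    summing a^i * f(n / b^i) across levels (integer division)."""
    total, calls, size = 0, 1, n
    while size >= 1:
        total += calls * f(size)  # cost contributed by this level
        calls *= a                # a times more calls at the next level
        size //= b                # each subproblem shrinks by a factor of b
    return total

# Merge-sort-like recurrence T(n) = 2T(n/2) + n on n = 16:
# 5 levels, each costing 16 in total → 80 = n * (log2 n + 1)
print(tree_cost(16, 2, 2, lambda m: m))  # 80
```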
3. Example 1: Binary Search
Binary Search halves the array each step:
T(n) = T(n/2) + O(1)
| Level | Number of Calls | Work per Call | Total Work |
|---|---|---|---|
| 0 | 1 | O(1) | O(1) |
| 1 | 1 | O(1) | O(1) |
| … | … | … | … |
| log₂ n | 1 | O(1) | O(1) |

With one call per level and about log₂ n + 1 levels, Total Work = O(log n).
The recursion tree has a single path — depth = log₂ n.
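A quick way to confirm the single-path picture is to instrument a recursive binary search to report how deep it recursed. A sketch (the parameter layout is one of many possible):

```python
def binary_search(arr, target, lo=0, hi=None, depth=0):
    """Recursive binary search returning (index or -1, recursion depth)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1, depth          # range exhausted: target absent
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid, depth
    if arr[mid] < target:         # one recursive call per level — a single path
        return binary_search(arr, target, mid + 1, hi, depth + 1)
    return binary_search(arr, target, lo, mid - 1, depth + 1)

idx, depth = binary_search(list(range(1024)), 1)
print(idx, depth)  # index 1 found within at most log2(1024) = 10 levels
```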
4. Example 2: Merge Sort
T(n) = 2 T(n/2) + O(n)
| Level | Calls | Work per Call | Total Work |
|---|---|---|---|
| 0 | 1 | n | n |
| 1 | 2 | n/2 | n |
| 2 | 4 | n/4 | n |
| … | … | … | n |
| log₂ n | n | O(1) | n |

Every level contributes n work, and there are log₂ n + 1 levels, so Total Work = n × (log₂ n + 1) = O(n log n).
The recursion tree shows constant total work per level — a hallmark of balanced divide-and-conquer algorithms.
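The "constant total work per level" claim can be checked empirically. This instrumented merge sort (a sketch; the `work` dictionary is illustrative bookkeeping) records how many elements are merged at each tree level:

```python
def merge_sort(arr, level=0, work=None):
    """Merge sort that tallies merge work (elements written) per tree level."""
    if work is None:
        work = {}
    if len(arr) <= 1:
        return arr, work                      # size-1 base case: no merge work
    mid = len(arr) // 2
    left, _ = merge_sort(arr[:mid], level + 1, work)
    right, _ = merge_sort(arr[mid:], level + 1, work)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # standard two-way merge
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    work[level] = work.get(level, 0) + len(merged)
    return merged, work

_, work = merge_sort([5, 3, 8, 1, 9, 2, 7, 4])
print(work)  # every level touches all 8 elements: {2: 8, 1: 8, 0: 8}
```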
5. Example 3: Fibonacci (Exponential Recursion)
T(n) = T(n-1) + T(n-2) + O(1)
Each call spawns two more, producing a call tree of height n.
Number of calls ≤ 2ⁿ (the tight bound is O(φⁿ), where φ ≈ 1.618 is the golden ratio).
Hence, time complexity = O(2ⁿ) as a convenient upper bound.
Visualization (deeper subtrees truncated for brevity):

```
fib(5)
├── fib(4)
│   ├── fib(3)
│   └── fib(2)
└── fib(3)
    ├── fib(2)
    └── fib(1)
```
Exponential growth arises because identical subproblems repeat.
The recursion tree exposes this inefficiency — motivating memoization or dynamic programming.
6. From Recursion Tree to Master Theorem
Once the recursion tree is clear, you can apply the Master Theorem for quick asymptotic estimation:
For T(n) = a T(n/b) + f(n):
- If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
- If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n)
- If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and f satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1, then T(n) = Θ(f(n))
Example
T(n) = 2 T(n/2) + n
→ a = 2, b = 2, f(n) = n
n^{log_b a} = n, so second case ⇒ Θ(n log n).
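When the driving function is a plain polynomial f(n) = Θ(n^d), the three cases reduce to comparing d against log_b a (and polynomial f satisfies case 3's regularity condition automatically). A small classifier sketch under that assumption:

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) via the Master Theorem.
    Assumes f(n) is a plain polynomial n^d."""
    crit = math.log(a, b)               # critical exponent log_b(a)
    if d < crit:
        return f"Theta(n^{crit:g})"     # case 1: the leaves dominate
    if d == crit:
        return f"Theta(n^{d:g} log n)"  # case 2: every level contributes equally
    return f"Theta(n^{d:g})"            # case 3: the root dominates

print(master_case(2, 2, 1))  # Merge Sort    → Theta(n^1 log n)
print(master_case(1, 2, 0))  # Binary Search → Theta(n^0 log n), i.e. Theta(log n)
```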
7. Visual Complexity Intuition
| Pattern | Typical Algorithm | Complexity |
|---|---|---|
| One branch | Binary Search | O(log n) |
| Two equal branches | Merge Sort | O(n log n) |
| Many branches | Fibonacci | O(2ⁿ) |
| Unequal branches | Quick Sort (avg) | O(n log n) |
Seeing recursion as a tree of calls transforms complexity analysis from abstract math into intuitive visualization.
8. Space Complexity via Recursion Depth
The height of the recursion tree equals the maximum call-stack depth.
Thus:
- Binary Search → O(log n) stack space
- Merge Sort → O(log n) stack space (though the O(n) auxiliary arrays used for merging dominate its overall space)
- Naive Fibonacci → O(n) stack space, from the deepest chain fib(n) → fib(n−1) → … → fib(1)
Optimization targets both dimensions — reduce redundant calls and minimize stack depth.
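Stack depth can be modeled the same way as cost: follow the deepest chain of calls. A tiny sketch (the `shrink` parameter stands in for how each call reduces its input):

```python
def max_depth(n, shrink):
    """Length of the deepest call chain when each call shrinks n via `shrink`."""
    depth = 0
    while n > 1:
        n = shrink(n)
        depth += 1
    return depth

print(max_depth(1024, lambda n: n // 2))  # halving (Binary Search / Merge Sort): 10 frames
print(max_depth(1024, lambda n: n - 1))   # decrement (naive Fibonacci's deepest path): 1023 frames
```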
Conclusion
The recursion tree is the programmer’s microscope. It reveals where the time goes, how often subcalls repeat, and how deep the stack grows. Mastering it bridges the gap between writing recursive algorithms and analyzing them like a compiler.
In the next post:
Advanced Recursion — Part 3: Memoization and Dynamic Programming — Turning Exponential Recursion into Polynomial-Time Solutions