A recursive function calls itself with a smaller version of the problem until it hits a base case that stops the chain.
Every recursion needs two things: a base case that can be answered directly, and a recursive case that shrinks the problem toward it.
def factorial(n):
    if n <= 1:                       # base case
        return 1
    return n * factorial(n - 1)      # recursive case
Each recursive call pushes a frame onto the call stack. When the base case returns, frames pop in reverse order.
factorial(4)
  → factorial(3)
    → factorial(2)
      → factorial(1) → returns 1
    ← returns 2
  ← returns 6
← returns 24
If there's no base case, the stack overflows. Python's default recursion limit is 1,000 frames.
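A minimal sketch of what happens without a base case (the helper name `no_base` is made up for illustration; `sys.getrecursionlimit()` reports the current limit, which defaults to 1000):

```python
import sys

def no_base(n):
    # No base case: every call recurses again, so the stack
    # eventually overflows and Python raises RecursionError.
    return no_base(n + 1)

try:
    no_base(0)
except RecursionError:
    print("hit the recursion limit")

print(sys.getrecursionlimit())  # commonly 1000
```

The limit can be raised with `sys.setrecursionlimit`, but deep recursion is usually better rewritten as iteration.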
The naïve recursive Fibonacci is O(2ⁿ) because it recomputes the same sub-problems. Memoisation fixes this in O(n) time and space.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
Backtracking is recursion with a twist: you make a choice, recurse, then undo the choice before trying the next option. Think of it as exploring a maze - walk forward, hit a dead end, walk back, try the next path.
# is_solution, is_valid and result are problem-specific placeholders
def backtrack(state, choices):
    if is_solution(state):
        result.append(state.copy())
        return
    for choice in choices:
        if is_valid(choice):
            state.add(choice)        # choose
            backtrack(state, ...)    # explore
            state.remove(choice)     # un-choose
This three-step pattern - choose, explore, un-choose - is the backbone of nearly every backtracking problem.
In the backtracking template, why do we undo the choice after the recursive call?
Permutations: generate all orderings of [1, 2, 3]. At each level, pick an unused number, recurse, then un-pick.
               []
            /  |  \
         [1]  [2]  [3]
        /   \        ...
    [1,2]  [1,3]
      |       |
  [1,2,3]  [1,3,2]   ... (6 total)
There are n! permutations, so the time complexity is O(n × n!).
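The tree above maps directly onto the choose/explore/un-choose template. A sketch (the function name and the `used` bookkeeping are my choices, not from the source):

```python
def permutations(nums):
    result = []

    def backtrack(path, used):
        if len(path) == len(nums):       # a full ordering is a solution
            result.append(path[:])       # copying costs O(n), hence O(n * n!)
            return
        for i, n in enumerate(nums):
            if used[i]:
                continue                 # skip numbers already in the path
            used[i] = True
            path.append(n)               # choose
            backtrack(path, used)        # explore
            path.pop()                   # un-choose
            used[i] = False

    backtrack([], [False] * len(nums))
    return result

permutations([1, 2, 3])  # 6 orderings
```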
Combinations (n choose k): at each element, include it or skip it, but only recurse forward to avoid duplicates.
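A sketch of the forward-only idea for n choose k (the `start` parameter is what prevents duplicates; naming is mine):

```python
def combinations(n, k):
    result = []

    def backtrack(start, path):
        if len(path) == k:
            result.append(path[:])
            return
        for i in range(start, n + 1):
            path.append(i)               # choose
            backtrack(i + 1, path)       # explore: only look forward
            path.pop()                   # un-choose

    backtrack(1, [])
    return result

combinations(4, 2)  # [[1,2], [1,3], [1,4], [2,3], [2,4], [3,4]]
```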
Subsets: same idea without a size constraint - every node in the recursion tree is a valid subset. There are 2ⁿ subsets.
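Dropping the size check from the combinations sketch gives subsets; note that the result is recorded at every node, not just at the leaves:

```python
def subsets(nums):
    result = []

    def backtrack(start, path):
        result.append(path[:])           # every node is a valid subset
        for i in range(start, len(nums)):
            path.append(nums[i])         # choose
            backtrack(i + 1, path)       # explore
            path.pop()                   # un-choose

    backtrack(0, [])
    return result

subsets([1, 2, 3])  # 8 subsets, from [] to [1, 2, 3]
```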
Place N queens on an N×N board so none attack each other. For N=4:
. Q . .
. . . Q
Q . . .
. . Q .
Strategy: place one queen per row. For each row, try every column. Check column, main diagonal, and anti-diagonal conflicts. If safe, place and recurse to the next row. If stuck, backtrack.
Track conflicts with three sets: cols, diag (row − col), and anti_diag (row + col).
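The strategy above can be sketched as follows (I represent a solution as a list where `placement[row]` is the queen's column; the three conflict sets make each safety check O(1)):

```python
def solve_n_queens(n):
    solutions = []
    cols, diag, anti_diag = set(), set(), set()
    placement = []                       # placement[row] = column of that row's queen

    def backtrack(row):
        if row == n:                     # every row has a queen
            solutions.append(placement[:])
            return
        for col in range(n):
            if col in cols or (row - col) in diag or (row + col) in anti_diag:
                continue                 # prune: attacks an earlier queen
            cols.add(col)                # choose
            diag.add(row - col)
            anti_diag.add(row + col)
            placement.append(col)
            backtrack(row + 1)           # explore the next row
            placement.pop()              # un-choose
            cols.remove(col)
            diag.remove(row - col)
            anti_diag.remove(row + col)

    backtrack(0)
    return solutions

solve_n_queens(4)  # 2 solutions, including [1, 3, 0, 2]
```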
In the N-Queens problem, how do we efficiently check diagonal conflicts?
Sudoku: find an empty cell, try digits 1–9. For each digit, check row, column, and 3×3 box constraints. If valid, place and recurse. If no digit works, undo and backtrack. The pruning from constraints makes it tractable despite the huge search space.
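A sketch of that loop, assuming a 9×9 grid of ints where 0 marks an empty cell (the representation is my choice):

```python
def solve_sudoku(board):
    """Solve the board in place; return True if a solution was found."""
    def valid(r, c, d):
        if d in board[r]:                            # row conflict
            return False
        if any(board[i][c] == d for i in range(9)):  # column conflict
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)          # top-left of the 3x3 box
        return all(board[br + i][bc + j] != d
                   for i in range(3) for j in range(3))

    for r in range(9):
        for c in range(9):
            if board[r][c] == 0:                     # first empty cell
                for d in range(1, 10):
                    if valid(r, c, d):
                        board[r][c] = d              # choose
                        if solve_sudoku(board):      # explore
                            return True
                        board[r][c] = 0              # un-choose
                return False                         # no digit fits: backtrack
    return True                                      # no empty cells: solved
```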
Word search: given a 2D board of letters and a target word, start from every cell and DFS in four directions, matching one character at a time. Mark cells as visited during recursion and unmark on backtrack to allow other paths.
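A sketch of the DFS, using an in-place marker instead of a separate visited set (the `"#"` sentinel is an assumption; any character not in the alphabet works):

```python
def exist(board, word):
    rows, cols = len(board), len(board[0])

    def dfs(r, c, i):
        if i == len(word):
            return True                  # matched every character
        if not (0 <= r < rows and 0 <= c < cols) or board[r][c] != word[i]:
            return False                 # off the board or wrong letter
        board[r][c] = "#"                # mark visited (choose)
        found = (dfs(r + 1, c, i + 1) or dfs(r - 1, c, i + 1) or
                 dfs(r, c + 1, i + 1) or dfs(r, c - 1, i + 1))
        board[r][c] = word[i]            # unmark (un-choose)
        return found

    return any(dfs(r, c, 0) for r in range(rows) for c in range(cols))
```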
The cost depends on the branching factor (choices per step) and depth (steps to a solution). For b branches and d depth: O(bᵈ).
| Problem | Branching | Depth | Complexity |
|---------|-----------|-------|------------|
| Permutations | n, n−1, … | n | O(n!) |
| Subsets | 2 | n | O(2ⁿ) |
| N-Queens | ~n | n | O(n!) worst case |
Pruning means skipping branches you know will fail. Examples: abandon a partial sum once it exceeds the target, skip a queen placement that conflicts with an earlier one, or sort the candidates so you can stop a loop early instead of checking every option.
Good pruning can turn an impractical search into a fast one.
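A sketch of pruning in action, using a combination-sum problem (find all multisets of candidates summing to a target; the sort plus `break` is the pruning step):

```python
def combination_sum(candidates, target):
    candidates = sorted(candidates)      # sorting enables the early exit below
    result = []

    def backtrack(start, path, remaining):
        if remaining == 0:
            result.append(path[:])       # exact sum reached
            return
        for i in range(start, len(candidates)):
            if candidates[i] > remaining:
                break                    # prune: every later candidate is even bigger
            path.append(candidates[i])   # choose
            backtrack(i, path, remaining - candidates[i])  # explore (reuse allowed)
            path.pop()                   # un-choose

    backtrack(0, [], target)
    return result

combination_sum([2, 3, 5], 8)  # [[2, 2, 2, 2], [2, 3, 3], [3, 5]]
```

Without the `break`, the search would still descend into branches whose partial sums already overshoot the target.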
What does pruning do in a backtracking algorithm?