Take a look at this simple problem:
Compute \(a^b \mod m\)
The naive algorithm is \(O(b)\): just multiply by \(a\), \(b\) times. However, most of you are probably aware of the divide-and-conquer method, which runs in \(O(\lg b)\): since \(a^b = (a^{b/2})^2\) when \(b\) is even and \(a^b = a \cdot (a^{(b-1)/2})^2\) when \(b\) is odd, the exponent halves at every step. The question is: recursive or iterative?
Many will say that recursion is the more natural way to think, especially when dealing with Dynamic Programming, Segment Trees and Divide-and-Conquer algorithms. But recursive functions are known to be slow, and I'll show you the actual runtime data.
Recursive algorithm:
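Roughly, the recursive version looks like this (a minimal sketch; the exact names and types are my own choices):

```cpp
// Recursive fast exponentiation: a^b mod m in O(lg b) multiplications.
// Assumes m fits in 32 bits, so the products below do not overflow long long.
long long pow_rec(long long a, long long b, long long m) {
    if (b == 0) return 1 % m;                // base case: a^0 = 1
    long long half = pow_rec(a, b >> 1, m);  // a^(b/2) mod m
    long long res = half * half % m;         // square it
    return (b & 1) ? res * a % m : res;      // one extra factor of a if b is odd
}
```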
Iterative algorithm:
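And a corresponding sketch of the iterative version (again, names and types are my own choices, same overflow assumption as above):

```cpp
// Iterative fast exponentiation: scan the bits of b from least to most significant.
long long pow_iter(long long a, long long b, long long m) {
    long long res = 1 % m;
    a %= m;
    while (b > 0) {
        if (b & 1) res = res * a % m;  // this bit of b contributes a factor of a^(2^k)
        a = a * a % m;                 // square the base for the next bit
        b >>= 1;                       // move on to the next bit of b
    }
    return res;
}
```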
I've used some simple bit-manipulation tricks here. A quick explanation:
```cpp
b & 1   // b % 2 == 1
b >> 1  // b / 2
```
No doubt recursion is easier to understand and implement. But after some experiments, I collected runtime data for both algorithms. You can generate your own input file and try it out yourself.
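Here is a rough sketch of the kind of generator I mean; the input format (a line with \(N\), followed by \(N\) lines of `a b m` triples) and the value ranges are just one possible choice:

```cpp
#include <cstdio>
#include <random>

int main() {
    const int N = 10000000;                               // number of queries (assumed)
    std::mt19937_64 rng(20240101);                        // fixed seed so runs are reproducible
    std::uniform_int_distribution<long long> val(1, 1000000000LL);

    std::printf("%d\n", N);
    for (int i = 0; i < N; ++i) {
        // one query per line: base, exponent, modulus
        std::printf("%lld %lld %lld\n", val(rng), val(rng), val(rng));
    }
    return 0;
}
```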
This is definitely not the best way to generate test cases, as I don't have much experience with it, but it's enough for this problem.
The runtimes are shown in the table below:
We can see that the recursive version runs much slower than the iterative one when \(N\) is about \(10^7\). This is a concrete result. However, I couldn't find a reliable source explaining why. My best guess is the function-call overhead: each recursive call has to set up a new stack frame and then pass its result back to the previous level of the recursion. Any ideas?