Take a look at this simple problem:
Compute $a^b \mod m$
The naive algorithm is $O(n)$: you just multiply by $a$, $b$ times (here $n = b$). However, most of you are probably aware of the divide-and-conquer method, which runs in $O(\lg n)$ time. The question is: recursive or iterative?
Many will say that recursion is the more natural way to think, especially when dealing with Dynamic Programming, Segment Trees, and Divide-and-Conquer algorithms. But recursive functions are known to be slow, and I'll show you the actual runtime data.
Recursive algorithm:
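A sketch of the recursive version (C++ is assumed; `m` should fit in 32 bits so the products below don't overflow 64-bit arithmetic):

```cpp
#include <cstdint>

// a^b mod m via divide and conquer: O(lg b) multiplications.
uint64_t power(uint64_t a, uint64_t b, uint64_t m) {
    if (b == 0) return 1 % m;            // a^0 = 1
    uint64_t half = power(a, b / 2, m);  // a^(b/2) mod m
    uint64_t sq = half * half % m;       // a^(b - b%2) mod m
    return (b & 1) ? sq * (a % m) % m : sq;
}
```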
Iterative algorithm:
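A sketch of the iterative version (again C++ assumed; the bit tricks it relies on are explained below):

```cpp
#include <cstdint>

// a^b mod m by repeated squaring: O(lg b) multiplications, no call stack.
uint64_t power(uint64_t a, uint64_t b, uint64_t m) {
    uint64_t result = 1 % m;
    a %= m;
    while (b > 0) {
        if (b & 1)                   // lowest bit of b is set
            result = result * a % m;
        a = a * a % m;               // square: a, a^2, a^4, ...
        b >>= 1;                     // drop the lowest bit
    }
    return result;
}
```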
I've used some simple bit-manipulation techniques here. A quick explanation:
b & 1 // b % 2 == 1
b >> 1 // b / 2
No doubt recursion is easier to understand and implement, but after some experiments I collected runtime data for both algorithms. You can generate your own input file to try it out.
Generator:
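A sketch of a generator in this spirit (the input format — a count line, then one `a b m` triple per test — and the bounds are my assumptions):

```cpp
#include <cstdio>
#include <cstdlib>

// One test case: compute a^b mod m.
struct Test { int a, b, m; };

// Random triple; the bounds here are arbitrary and easy to tweak.
Test random_test() {
    Test t;
    t.a = rand() % 1000000 + 1;
    t.b = rand() % 1000000 + 1;
    t.m = rand() % 1000000 + 2;  // modulus of at least 2
    return t;
}

// Write n test cases: a count line, then one "a b m" line per case.
void write_tests(FILE *out, int n) {
    fprintf(out, "%d\n", n);
    for (int i = 0; i < n; i++) {
        Test t = random_test();
        fprintf(out, "%d %d %d\n", t.a, t.b, t.m);
    }
}
```

Seed with `srand(time(NULL))` in `main` and pass `stdout` (or a `FILE *` from `fopen`) to `write_tests`.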
This is definitely not the best way to generate test cases, as I don't have much experience with this, but it's enough for this problem.
The runtimes (in seconds) are shown in the table below:

| $N$ | Recursive | Iterative |
| --- | --- | --- |
| 1,000 | 0.01 | 0.00 |
| 10,000 | 0.07 | 0.06 |
| 100,000 | 0.28 | 0.25 |
| 1,000,000 | 2.46 | 2.04 |
| 10,000,000 | 24.05 | 19.05 |
We can see that recursion runs noticeably slower than the iterative program once $N$ is around $10^7$. This is a concrete result, but I couldn't find a reliable source explaining why. My best guess is the cost of the function calls themselves, i.e., the time spent passing each result back up to the previous level of the recursion. Any ideas?
Comments
You're spot on. There's overhead involved in calling a function, and when you do it thousands of times, that overhead adds up.
Consider the following program, which simply recurses n times and then iterates n times. Compile and run it with optimizations turned off (because the example is so trivial, if optimizations are on, the recursive call is tail-call eliminated, ruining the example).
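A minimal version of such a program might look like this (the function names and the timing scaffold are mine):

```cpp
#include <cstdio>
#include <ctime>

// n nested calls; the "+ 1" keeps this from being a pure tail call,
// but compile with -O0 anyway to be safe.
long long recurse(long long n) {
    if (n == 0) return 0;
    return 1 + recurse(n - 1);
}

// n iterations of a trivial loop.
long long iterate(long long n) {
    long long count = 0;
    for (long long i = 0; i < n; i++) count++;
    return count;
}

// Time both with the same n and print the results.
void compare(long long n) {
    clock_t t0 = clock();
    recurse(n);
    clock_t t1 = clock();
    iterate(n);
    clock_t t2 = clock();
    printf("recursion: %.6f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("iteration: %.6f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
}
```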
(Note: if n is much bigger than 262000, I get a stack overflow on my machine. If this code immediately crashes for you when you run it, try reducing the size of n.) On my machine, the recursive half takes noticeably longer than the iterative half.
To call a function, roughly the following is done behind the scenes (you can see this if you look at the assembly): the caller pushes the arguments and the return address and jumps into the function; the function sets up its own stack frame, does its work, tears the frame down, and jumps back to the return address, where the caller picks up the result.**
As you can imagine (and as the example shows), doing this thousands of times is much slower than just incrementing an integer in a loop.
** The exact details of this are dependent on the ABI of the language/compiler/architecture.