Please leave recursion to math and keep it out of (in particular C) software: it kills and will kill again.
Kind regards from libexpat, see CVE-2022-25313 and CVE-2024-8176 for proof.
They don’t prevent it, they just don’t implement it. A real (physical) CPU is a fixed electrical circuit and can’t just divide in two the way it would have to for the ideal mathematical version of recursion. If you want a simple way to visualise this, how would you implement recursion (as opposed to just loops) on a Turing machine with tapes?
Different CPUs might have strategies to approximate certain kinds of recursion, but at some point my own knowledge ends, and there have been many different designs. Tail recursion in particular is usually just turned back into a loop by the compiler, and typical modern architectures implement a call stack at the hardware level, which allows you to do limited-depth recursion, but breaks like in OP if you try to go too deep.
Yes, in my experience this is what the term “recursion” means in a programming context; it doesn’t usually refer to a mathematical ideal. That was what tripped me up.
The basic definition would be something like: a function that calls itself, directly or indirectly, in its own code. It’s pretty easy to find examples that aren’t tail-recursive specifically, like mergesort, and examples within those that would overflow a hardware stack, like in OP. And that’s without looking at (mildly) exotic examples like the Ackermann function.
Basically, the “Please leave recursion to math and keep it out of (in particular C) software” in OP means don’t define functions using those functions. It’s pretty and it can work, but not reliably.
…what is your point? Some software (in a language that doesn’t have tail-recursion optimization) used recursion to handle user-provided input, and indeed it broke. Someone wrote to explain that that’s a potential vulnerability, the author agreed, and fixed it. Who here is misunderstanding how computers implement recursion?
I was answering a specific question put directly to me. There’s no “point”, exactly.
Taking a wild stab at how you might be reading this exchange, my original reply was about the title of the post, which implies a CompSci professor would be unhappy about someone criticising the use of recursion in code.
Wait, it doesn’t? I had kind of assumed GCC (for example) would do that at anything greater than -O0.