• CanadaPlus · 1 day ago

    They don’t prevent it, they just don’t implement it. A real (physical) CPU is a fixed electrical circuit; it can’t split off a fresh copy of itself the way the ideal mathematical version of recursion would require. If you want a simple way to visualise this, how would you implement recursion (as opposed to just loops) on a Turing machine with tapes?

    Different CPUs might have strategies to approximate certain kinds of recursion, but at some point my own knowledge ends, and there have been many different designs. Tail recursion in particular is usually just turned back into a loop by the compiler, and typical modern architectures implement a call stack at the hardware level, which allows you to do limited-depth recursion, but breaks like in OP if you try to go too deep.

    • Redkey@programming.dev · 17 hours ago

      Tail recursion in particular is usually just turned back into a loop at the compiler, and typical modern architectures implement a call stack at the hardware level, which allows you to do limited-depth recursion, but breaks like in OP if you try to go too deep.

      Yes, in my experience this is what the term “recursion” means in a programming context; it doesn’t usually refer to a mathematical ideal. That was what tripped me up.

      • CanadaPlus · 3 hours ago

        The basic definition would be something like use of a function in that function’s own code. It’s pretty easy to find examples that aren’t tail-recursive specifically, like mergesort, and, among those, inputs that would overflow a hardware stack, like in OP. And that’s without looking at (mildly) exotic examples like the Ackermann function.

        Basically, the “Please leave recursion to math and keep it out of (in particular C) software” in OP means don’t define functions using those functions. It’s pretty and it can work, but not reliably.

    • BatmanAoD@programming.dev · 15 hours ago

      …what is your point? Some software (in a language that doesn’t have tail-recursion optimization) used recursion to handle user-provided input, and indeed it broke. Someone wrote to explain that that’s a potential vulnerability, the author agreed, and fixed it. Who here is misunderstanding how computers implement recursion?

      • CanadaPlus · 3 hours ago

        I was answering a specific question put directly to me. There’s no “point”, exactly.

        Who here is misunderstanding how computers implement recursion?

        Taking a wild stab at how you might be reading this exchange, my original reply was about the title of the post, which implies a CompSci professor would be unhappy about someone criticising the use of recursion in code.

        (in a language that doesn’t have tail-recursion optimization)

        Wait, it doesn’t? I had kind of assumed GCC (for example) would do that at anything greater than -O0.