• CanadaPlus
    1 day ago

    An actual compsci professor would know real CPUs don’t run arbitrary recursion, right? Nobody could possibly be that siloed.

    • Redkey@programming.dev
      18 hours ago

      Could someone expand a little on this statement, or point me toward an applicable resource? How do “real” (modern?) CPUs prevent unwanted recursion? As in, not the compiler or the OS, but the CPU itself? I’ve been searching for a while now but I haven’t found anything that clears this up for me.

      • CanadaPlus
        8 hours ago

        They don’t prevent it, they just don’t implement it. A real (physical) CPU is a fixed electrical circuit and can’t just divide in two the way it would have to for the ideal mathematical version of recursion. If you want a simple way to visualise this, how would you implement recursion (as opposed to just loops) on a Turing machine with tapes?
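
        To make that concrete, here's a minimal sketch (my own toy example, not from any particular architecture) of how a machine that only has loops and fixed storage can simulate recursion: you keep an explicit stack of pending work and drive it with a loop. The function name `sum_to` and the fixed stack size are illustrative assumptions.

        ```c
        #include <stdio.h>

        /* Illustrative only: recursion simulated with an explicit stack
         * and a loop, the way a fixed circuit has to do it. */
        int sum_to(int n) {
            int stack[64]; /* fixed-size storage, like real hardware */
            int top = 0;

            /* "call" phase: push pending work instead of recursing */
            while (n > 0)
                stack[top++] = n--;

            /* "return" phase: pop and combine the results */
            int total = 0;
            while (top > 0)
                total += stack[--top];
            return total;
        }

        int main(void) {
            printf("%d\n", sum_to(10)); /* 10 + 9 + ... + 1 = 55 */
            return 0;
        }
        ```

        Note the fixed `stack[64]`: exceed it and the simulation fails, which is exactly the "limited depth" situation below.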

        Different CPUs might have strategies to approximate certain kinds of recursion, but at some point my own knowledge ends, and there have been many different designs. Tail recursion in particular is usually just turned back into a loop by the compiler, and typical modern architectures implement a call stack at the hardware level, which allows limited-depth recursion but breaks like in the OP if you try to go too deep.
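
        As a rough sketch of the tail-recursion point (my own example; the function names are made up): when the recursive call is the very last thing a function does, an optimizing compiler can typically reuse the current stack frame, which amounts to emitting the loop shown alongside it. Without that optimization, a deep enough call chain exhausts the hardware call stack.

        ```c
        #include <stdio.h>

        /* Tail-recursive sum: the recursive call is the last action, so
         * an optimizing compiler (e.g. gcc/clang at -O2) can typically
         * reuse the stack frame, i.e. compile this down to a loop. */
        long sum_tail(long n, long acc) {
            if (n == 0)
                return acc;
            return sum_tail(n - 1, acc + n); /* tail call */
        }

        /* Roughly the loop the compiler produces. Runs in constant
         * stack space no matter how large n is. */
        long sum_loop(long n) {
            long acc = 0;
            while (n > 0)
                acc += n--;
            return acc;
        }

        int main(void) {
            printf("%ld %ld\n", sum_tail(1000, 0), sum_loop(1000));
            return 0;
        }
        ```

        Compiled without optimization, `sum_tail` really does make one stack frame per call, so a large enough `n` hits the stack limit; as a loop it doesn't.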