• teft · 55 · 2 months ago

    Just like the human eye can only register 60 fps and no more, your computer can only register 4 GB of RAM and no more. Anything more than that is just marketing.

    Fucking /S since you clowns can’t tell.

    • @MonkderDritte@feddit.de · 8 · 2 months ago

      Joke's on you, because I looked into this once. I no longer remember the exact time the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 13 ms (the color-sensitive cones are far slower). Psycho-optical effects can push that up to around 100 fps on LCD displays. It also looks like you can train yourself, through certain computer tasks, to follow movements with your eyes, which makes you far more sensitive to flickering.
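      For the curious, a quick back-of-the-envelope conversion between the two figures quoted above (nothing here is a measurement, just the arithmetic linking frame time and frame rate):

      ```c
      #include <stdio.h>

      int main(void) {
          /* period_ms = 1000 / fps and fps = 1000 / period_ms */
          printf("70 fps -> %.1f ms per refresh\n", 1000.0 / 70.0); /* ~14.3 ms  */
          printf("13 ms  -> %.1f fps\n", 1000.0 / 13.0);            /* ~76.9 fps */
          return 0;
      }
      ```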

      • @SorryQuick@lemmy.ca · 3 · 2 months ago

        According to this study, the eye can detect a difference at up to 500 fps. While that's a specific scenario, it's one that could plausibly happen in a video game, so I guess we can go to around 500 Hz monitors before it becomes overkill or unnecessary.

      • Captain Aggravated · 3 · 2 months ago

        Does that refresh take place across the entire eye simultaneously or is each rod and/or cone doing its own thing?

        • teft · 4 · 2 months ago

          Are your eyeballs progressive scan or interlaced, son?

      • @iopq@lemmy.world · 2 · 2 months ago

        It's not about training; eye tracking is just that much more sensitive to pixels jumping.

        You can immediately see choppy movement when you look around in a first-person view game. Or, if it's an RTS, you can see the trail behind your mouse cursor anyway.

        I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images in certain parts of the screen.

        Just give me a 480 FPS OLED with black frame insertion already, FFS.

        • @MonkderDritte@feddit.de · 2 · 2 months ago

          Well, I don't follow movements with my eyes (I jump straight to the target) and I see no difference between 30 and 60 FPS; I run Ark Survival comfortably on my iGPU at 20 FPS. And I'm still pretty good in shooters.

          Yeah, it's a shame that our current tech stack doesn't allow updating only the parts of the image where something actually changes.

    • @TheRedSpade@lemmy.world · 4 · 2 months ago

      This is only true if you're still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
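      For anyone wondering where "16 million TB" comes from: a 64-bit address space covers 2^64 bytes, and 1 TiB is 2^40 bytes, so the limit is 2^24 ≈ 16.8 million TiB. A minimal sketch of that arithmetic (ignoring that real CPUs expose fewer physical address bits):

      ```c
      #include <stdio.h>
      #include <stdint.h>

      int main(void) {
          /* 2^64 bytes / 2^40 bytes-per-TiB = 2^24 TiB */
          uint64_t tib = UINT64_C(1) << (64 - 40);
          printf("64-bit address space = %llu TiB\n", (unsigned long long)tib);
          return 0;
      }
      ```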

      • teft · 19 · 2 months ago

        Sorry I forgot to put my giant /s.

      • Sibbo · 5 · 2 months ago

        With PAE, a 32-bit CPU can also use more, but each process is still limited to 4 GiB.
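        Roughly, the numbers work out like this (assuming the classic 36-bit PAE physical address width; later CPUs extend it further, so treat this as a sketch):

        ```c
        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            /* PAE widens the physical address to 36 bits on classic x86,
               while each process keeps a 32-bit virtual address space. */
            uint64_t phys = UINT64_C(1) << 36;   /* 64 GiB of addressable RAM */
            uint64_t virt = UINT64_C(1) << 32;   /*  4 GiB per process        */
            printf("PAE physical limit: %llu GiB\n", (unsigned long long)(phys >> 30));
            printf("per-process virtual limit: %llu GiB\n", (unsigned long long)(virt >> 30));
            return 0;
        }
        ```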

      • TimeSquirrel · 4 · 2 months ago

        This is only true if you’re still using a 32-bit CPU

        Bank switching to “fake” access to a larger address space was a big thing in the 80s… so it’s technically possible to reach memory wider than the address bus by dividing it into portions the CPU can map in, one at a time.
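        A toy sketch of the idea, purely illustrative (the names and sizes here are made up; real 8-bit machines did this with a hardware latch that swapped which physical bank appeared in a fixed CPU-visible window):

        ```c
        #include <stdint.h>
        #include <string.h>

        #define BANK_SIZE 16384                    /* 16 KiB window the CPU can see */
        #define NUM_BANKS 8                        /* 8 * 16 KiB = 128 KiB in total */

        static uint8_t backing_store[NUM_BANKS][BANK_SIZE];
        static uint8_t *window = backing_store[0]; /* what the CPU currently "sees" */

        static void bank_select(int bank) {
            window = backing_store[bank];          /* hardware would flip address lines */
        }

        int main(void) {
            bank_select(5);                        /* map bank 5 into the window    */
            memset(window, 0xAA, BANK_SIZE);       /* these writes land in bank 5   */
            bank_select(0);                        /* bank 5's data is now hidden   */
            return window[0];                      /* reads see bank 0 (still zero) */
        }
        ```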

    • Pennomi · -9 · 2 months ago

      That's not sarcasm, it's misinformation. It's not surprising that people downvoted you, even though it was just a joke.

      • @starman@programming.dev · 15 · 2 months ago

        I don't think that somebody actually read that computers can't register more than 4 GiB of RAM and then thought:

        “That's totally true, because u/teft said it is.”

        • Pennomi · 3 · 2 months ago

          It certainly used to be true, back in the era of 32-bit computers.