The previous link was broken, so I’ve reposted a safer one with archive.org

  • @vithigar@lemmy.ca
    21 · 1 year ago

    This article needs a clearer title. I agree that upgrading from a 6000 or 3000 series card right now is almost completely pointless, and even going back another generation it’s still not a great proposition. But I know people with “gaming PCs” rocking 1650s or even 1050s. Lots of folks with medium or low end several generations old hardware out there, for whom great upgrade options exist.

    • VitaMan
      6 · 1 year ago

      In March, I upgraded my video card from a 1660 to a 6750. I am really happy with how much better things look now, especially while gaming.

    • Mewtwo
      3 · 1 year ago

      I finished school and want to start gaming again. My PC has an AMD 370 I bought back in 2015. Is that still a decent GPU to play games on today?

      • IntegrationLabGod
        5 · 1 year ago

        The 370 will struggle with almost anything recent since it only has 2GB of VRAM. The Radeon 6600 would be an excellent upgrade there.

  • @nivenkos@lemmy.world
    15 · 1 year ago

    Do that many people upgrade every generation?

    I still use a 1070, so the GPU comparisons here aren’t relevant.

    The main issue I hit was deciding between DDR4 and DDR5 RAM since we’re in an awkward transition phase - and that affects motherboard and therefore CPU choices too.

    • Upgrading every generation is stupid. I try to upgrade every 5 years if I can afford it.

      My 1080ti says the performance gap versus cost to upgrade is not affordable right now. So I gotta keep waiting.

    • @TheHighRoad@lemmy.world
      5 · 1 year ago

      Well, I’ve had the same CPU/Mobo/RAM for over ten years and only upgraded my GPU once, from a GTX 660 to a 5700 XT at the start of the pandemic. I’m finally seeing some issues with some modern AAA content. Hogwarts Legacy won’t really run at all, for example.

      I also haven’t wiped my system in the same amount of time, so that may be more the culprit than the system itself. Still going strong!

      • @LyD@lemmy.ca
        4 · 1 year ago

        FYI it probably isn’t the 5700XT that’s causing issues in Hogwarts, mine works fine.

        • @TheHighRoad@lemmy.world
          2 · 1 year ago

          I think it’s a memory issue, most likely due to the sorry state of my Windows installation. Need to knock off the lazy and wipe it, but it’s pretty remarkable that it works as well as it does. I started with fresh Win7 and have survived upgrades to Win8 and Win10 in addition to the major feature updates that come now and again. I thought it was totally borked a few years back but some obscure automated tool managed to fix it.

          IT BELONGS IN A MUSEUM!

      • @nivenkos@lemmy.world
        3 · 1 year ago

        The CPU becomes the real issue though - which then means changing motherboard, which means changing RAM, etc. and then you might as well get an NVMe too etc.

        • @TheHighRoad@lemmy.world
          2 · 1 year ago

          I’ve come to realize that I don’t really “upgrade” anything but the GPU and adding storage. I’ve never so much as dropped in a new CPU without going through the whole rigamarole you just described. Build them to last, folks.

        • @mangofromdjango@feddit.de
          1 · 11 months ago (edited)

          Sometimes you get around that for longer by upgrading to the highest possible configuration on that platform. Often for cheap second hand.

          I replaced my 2017 Ryzen 1800X with a Ryzen 5800X3D recently, which is supported on my X370 motherboard. Huge upgrade, no platform change required. I think I can wait for DDR5 and a new motherboard for years to come.

    • @wccrawford@lemmyonline.com
      3 · 1 year ago

      I used to upgrade every generation, and yeah, it was stupidly expensive. But it was my only hobby, and you could actually see performance increases each time.

      But for the last 10 years or so, there’s much less point. Sometimes there are major advances (CUDA, RTX) that make it worthwhile for a single-generation upgrade, but mostly it’s just a few FPS at highest settings. So now I just upgrade every few years.

      • @TheHighRoad@lemmy.world
        2 · 1 year ago

        Back in the 90s and early 00s, frequent upgrades were kind of required to stay up to date with new games. The last 10-15 years have been muuuuch slower in that regard, thanks to consoles I guess. I’m not complaining, but I miss the sense of developers really pushing boundaries like they did in the old days.

    • Dandroid
      1 · 1 year ago

      The only reason I upgraded my 10 series to a 30 series is because I’m a dummy and bought a monitor with only HDMI 2.1 and no DisplayPort, so I needed to upgrade my GPU or I would have no G-Sync. Otherwise, I probably would have waited at least 2 more generations to upgrade.

  • @Goret
    14 · 1 year ago

    Well, like some, I am still on the 10xx series (1060 3GB 🤣🥲) and starting to look at a future full system upgrade to an RX 79xx or 78xx when they’re out. Targeting a Black Friday sale to jump.

    I would be curious to know if many others are on a refresh cycle of up to 4-5 years.

    (Need to check how to create a poll in Lemmy)

        • @bionicjoey@lemmy.ca
          1 · 1 year ago

          Lol. My flair on pcmr used to be “GTX 970: definitely 4gb of VRAM”. Which is itself a pretty outdated joke nowadays

      • @Goret
        2 · 1 year ago

        That’s dedication (and to be fair, you’re probably pulling more fps than I do 😂)

    • @Willifire@lemmy.world
      4 · 1 year ago

      I have recently ordered a 7900xtx to replace my 1080ti. It was a good companion but just doesn’t cut it anymore. Originally wanted to upgrade with the last gen but scalpers made that impossible. And the used market is still fucked in my region.

    • theblandone
      2 · 1 year ago

      I guess I’m on the 7-10 year cycle. I just upgraded from a GTX 1050 (non-Ti) and an i5-4570. It played almost everything I wanted to play just fine at 1080p, and some at 1440p. I tend to be a patient gamer and play mostly indies, so it was great.

      This article feels like it was written in a language I don’t understand. I know that other people are more into the hobby than I am (which is fine, no judgement, good for them), but it’s just so far outside what I would consider normal for me that it caught me off guard.

    • @scutiger@lemmy.world
      2 · 1 year ago

      I upgraded last summer to a 6700 XT from a 1070 Ti. I didn’t need the upgrade, since the 1070 Ti is still a solid performer even now. There’s not much that it wouldn’t still run at reasonable settings. I really only upgraded because there was a decent sale, and I had some money burning a hole in my pocket. I could have easily waited another year and gone with a 6800 XT or better for a similar price.

    • @RedStrive@lemmynsfw.com
      1 · 1 year ago

      1070 Ti here. I think the fact that the needle hasn’t moved significantly forward, as the article puts it, has decreased prices to the point where an upgrade to a more modern setup makes sense for me personally.

      I agree with the article if we’re talking about an upgrade from 30-series GPUs, but things seem great for all other cases.

    • @dan1101@lemmy.world
      1 · 1 year ago

      Exactly. I’m on a 1050Ti and I’m not sure Starfield will be happy with that. Cyberpunk wasn’t too happy. And of course if I get a new card I will need new MB/CPU/RAM/etc.

  • @testman@lemmy.ml
    13 · 1 year ago

    repost

    You know that on Lemmy you can just edit the post, right? Title, URL, and text can all be changed.

  • HidingCat
    7 · 1 year ago (edited)

    The starting premise of the article is based on upgrading from the previous generation. Who in their right mind does that? Aside from the one time I got a freebie, all my upgrades were at least two generations apart.

    Edit: Also, at current prices on the RX 6000 series, as long as you’re three generations behind you’ll get a good upgrade. I went from a GTX 1070 to an RX 6700 XT and felt a big improvement there.

    • @scutiger@lemmy.world
      5 · 1 year ago

      They do make the point in the article that even upgrading from two generations back is a waste, as you’re basically getting no real benefit from having waited two generations instead of one. You may as well upgrade to last generation instead of this one and save yourself some money.

      If you’re three generations behind, no matter what your upgrade path is, you’re getting a significant upgrade, but it’s still not worth upgrading to the current gen when last gen is much better value for a marginal performance difference.

      The exception to all this is buying the absolute top-of-the-line, which is never good value, and which is again significantly inflated in price compared to the previous gen.

  • Hutch
    2 · 1 year ago

    I’m halfway through a 10-year cycle, with a 1060 3GB on a 7th-gen i5. It’s mostly Civ6, Stellaris, and Rocksmith 2014 @ 1080p, so it’s fine. The main problem is Windows 10 reaching end-of-life without upgrade support on my current hardware, and Rocksmith doesn’t work well on Linux. I’ll probably keep it as-is and start from scratch… when I see a title that I want to play enough to drop big cash on hardware.