• gurupaste@lemmy.world · 3 years ago

    I would like to be able to push my 4K 120 Hz display with something like an RTX 4090. But it’s so damn expensive, and it would require me to upgrade my PSU on top of the $1600 price of the GPU.

    If the 4080 were cheaper, I would get that. But the 4080 is so expensive itself that I might as well get the 4090 instead.

  • fouc@lemmy.world · 3 years ago

    Can’t really speak for AMD, but NVIDIA is in a position to drop every model except the x90 without major repercussions. Hell, they could even go full enterprise and sell A100s at a ridiculous profit margin. For NVIDIA, the gaming GPU market is just a “good to have” at this point. Artificial VRAM segmentation aside, the chips are simply too good to reserve for the consumer market, so they might as well sell them for silly money. They don’t particularly care about selling gaming GPUs because they aren’t losing anything by not doing so.

    • MrGoodBright@beehaw.org · 3 years ago

      I wonder to what extent the lower-tier cards are just underperforming chips binned from the same high-tier process.

  • Triage8420@lemmy.ml · 3 years ago

    Still rocking my 1070 Ti. I mostly play Overwatch 2 and Minecraft, so it works OK for me for now. Also, I’m broke and can’t afford the upgrade.

    • Captain_Wtv@lemmy.ml · 3 years ago

      Some AMD stuff is kind of cheap rn. You can get the 6700 XT for around $340, and it’s good performance-wise, I think. Granted, you’re on something where you don’t really need it, imo. I was on something a lot less powerful, which is why I took the plunge.

    • Derrek@lemmy.ml · 3 years ago

      I’ve had a 1070 Ti since 2018, and it has run everything I’ve purchased just fine.

      I’ve thought about checking out this ray tracing stuff the kids are into, but is there a card under $300 that anyone recommends? It would also need to be mini-ITX, as I have a tiny living-room gaming PC.

      • Poke@beehaw.org · 3 years ago

        Sorry, but I’m not sure you’re going to get a good ray tracing experience for less than $300.

        AMD probably has the best general use GPU in that price range.

        Intel probably has the best gaming GPU in that price range (with a big asterisk due to driver and DirectX issues).

        It’s just hard to recommend buying a GPU right now imo.

    • sailsperson@beehaw.org · 3 years ago

      1080 here. I’m really happy with the decision I made years back. Some games are terribly unoptimized, but that won’t make me shell out for a new piece of hardware.

      And anything that would actually be worth upgrading to from my GPU is going to be even bigger and block the front-panel pins on the new motherboard I was gifted last year. Yep.

  • Carlos Solís@social.azkware.net · 3 years ago

    Considering that both Nvidia and AMD have kept pushing the prices of baseline GPUs well beyond the gold standard of the 1060, even long after the Big Crypto Spike of 2020? Yeah, barely anyone is going to bother spending a small fortune on a GPU.

    • gumpy@beehaw.org · 3 years ago

      > barely anyone would bother spending a small fortune on a GPU

      Well, except datacenters. They can’t get enough of them, and the datacenter card prices would make you cry.

    • GalaxyGamer@sh.itjust.works · 3 years ago

      Then there’s also buying used, which tends to be so much cheaper than buying new from either of them that it makes sense their sales are falling off.

    • Communist@beehaw.org · 3 years ago

      Not only that, but the used market is booming, which is just gonna push these numbers even lower.

      • Onihikage@beehaw.org · 3 years ago

        There had to have been people in marketing who knew this would happen and were overruled by bean-counting executives. The top card of each generation outdoes the top of the previous gen, but for a couple of generations now the price has increased almost in lockstep with the performance. Often the newer card will have less VRAM than the previous generation’s equal-performing card, because you’re comparing an older top-spec card with a newer midrange one, and midrange cards always have less VRAM. With AAA games now starting to really want more VRAM for better visuals, the older cards wind up actually being the better option long-term.

  • Teali0@kbin.social · 3 years ago

    I’m not surprised. I was killing some time at Micro Center yesterday and couldn’t believe how high the prices were. The shelves were pretty much completely full. I found better GPU deals over in the laptop section. At least with that purchase you get a whole computer.

  • Rhabuko@feddit.de · 3 years ago

    It’s even worse if you do creative work on your PC. Nvidia dominates this field completely because of the performance difference. My GPU is old and I really, really need a new card for my 3D work, but Nvidia is such an awful company…

    • mayooooo@beehaw.org · 3 years ago

      I stopped buying new a long time ago; it doesn’t make sense financially or ecologically. It also doesn’t help that I live in a part of Europe where all PC parts are more expensive by default. But used or refurbished is the way to go: get a generation-older Quadro (or whatever they’re called now, A-something?) and you and your wallet will be happier.

  • UprisingVoltage@feddit.it · 3 years ago

    Personally, I’m starting to buy second-hand hardware, and I recommend it. Less pricey, more eco-friendly, and less money in the pockets of greedy corps.

    • bootyberrypancakes@lemmy.ml · 3 years ago

      Almost all of my computer hardware is second-hand! My Plex server was free from a guy on Reddit, and I built a second gaming PC for the TV for maybe $100.

  • ArtVandelay@lemmy.world · 3 years ago

    I bought a 4070 Ti for $1k and I deeply regret it. Not because I can’t afford it, but because I let my desire to game at 120 fps overpower my qualms about enabling a company to get away with these prices.

    • Reeek@beehaw.org · 3 years ago

      I feel that way too. My 2080 is still good, so the itch isn’t as strong, but when I play something on my 4K TV and the fps dips below 60, the itch returns. I truly don’t want to buy anything from Nvidia or AMD for a good while, so here’s hoping Intel keeps at it and doesn’t get stupidly expensive as well.

    • Behohippy@lemmy.world · 3 years ago

      I paid $1100 for a 3070 during the pandemic in a Newegg bundle deal (trash stuff they couldn’t sell). I already had a 2070, and it was a complete waste of money.

  • setInner234@feddit.de · 3 years ago

    My hopes are on Intel as a viable third party. It looks like AMD and Nvidia have agreed they can fleece customers with piss-poor price-to-performance improvements.

    • citrixscu@beehaw.org · 3 years ago

      Yes, and it is quite disappointing. I am hopeful that Intel does well with Battlemage as a potential option.

    • TheTrueLinuxDev@beehaw.org · 3 years ago

      With the absurd capability of the A770, yeah, it wins on price, so if Intel releases a 32 GB VRAM GPU in their next round of offerings, it’ll crush both AMD and Nvidia if the price lands somewhere between $350 and $450.

      (For those wondering, the A770 offers about half the FLOPS of an RTX 4090 and has 16 GB of VRAM, all for $350. That’s insane.)

      If Intel is really going for it, they could skip 32 GB and go straight to 64 GB of VRAM and crush both AMD and Nvidia outright, because at that point they could eat into both the consumer and business GPU markets.
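
      For a rough sense of what that claim implies, here’s a quick back-of-the-envelope FLOPS-per-dollar sketch. The 4090’s ~82.6 TFLOPS FP32 is its spec-sheet figure; the A770 line just takes the “half a 4090” claim above at face value (which is on the generous side), so treat the output as illustrative only:

      ```python
      # Back-of-the-envelope FLOPS-per-dollar comparison.
      # The A770 entry follows the comment's "half a 4090" claim,
      # not a verified spec sheet; swap in your own numbers as needed.
      cards = {
          "RTX 4090": {"tflops_fp32": 82.6, "price_usd": 1599},
          "Arc A770": {"tflops_fp32": 82.6 / 2, "price_usd": 350},  # claimed
      }

      for name, c in cards.items():
          gflops_per_dollar = c["tflops_fp32"] * 1000 / c["price_usd"]
          print(f"{name}: {gflops_per_dollar:.1f} GFLOPS per dollar")
      ```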

  • wagesof@links.wageoffsite.com · 3 years ago

    Everyone has a synced upgrade cycle now because EVERYONE upgraded while we were all locked down due to COVID. Does the massive spike in 2021/22 average out to a normal-looking graph? Yes?
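
    As a toy illustration of that pull-forward effect (all numbers made up, just to show the averaging):

    ```python
    # Hypothetical yearly GPU sales (millions of units): a lockdown spike
    # followed by a slump. Demand was pulled forward, not created.
    sales = {2019: 100, 2020: 150, 2021: 160, 2022: 50, 2023: 60}

    avg = sum(sales.values()) / len(sales)
    print(f"average over {len(sales)} years: {avg:.0f}M units/year")  # ~104M
    # Close to the 2019 baseline: the spike and the slump roughly cancel out.
    ```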

    • cstine@lemmy.uncomfortable.business · 3 years ago

      It’s not even only that: crypto mining went from soaking up every card the miners could find to literally zero almost overnight. Honestly, the spike during the super-high sales of 2020/21 was driven more by crypto than by gaming, and then that demand immediately vanished.

      Of course, Nvidia ALSO alienated the heck out of a lot of potential buyers, who are sitting on the sidelines because they’re not paying the inflated prices caused by that spike. So the crypto guys are gone, and the gamers are waiting.

  • Weerdo@lemmy.world · 3 years ago

    Just no reason to upgrade yet. My current card plays even the newest games at middling settings, which is tolerable. Based on previous experience, I’ll upgrade once every 5 years or so. I’m only two years into this secondhand card.

  • ghashul@feddit.dk · 3 years ago

    I’m still running a 1060 6 GB card. I’ll keep it as long as I can, and then I’ll likely upgrade to something that isn’t the newest generation at the time.

    • Jediotty@beehaw.org · 3 years ago

      I’ll probably use my 1070 till it dies, and even after that if I’m able to fix it :)

    • BOMBS@lemmy.world · 3 years ago

      Same. I’m on a 1060 6GB and doing fine. My machine is at the point where it’s maxed out, so I’d need to build a whole new one. Still, I use Linux exclusively and lag a few years behind the latest games to save money, so I’m in no hurry. I’ll get there when the prices want to come down to my level.

    • snoopfrog@beehaw.org · 3 years ago

      Ditto. My 1660 Super and 10th-gen i5 run Diablo 4 and Lightroom smoothly. No need to upgrade until that’s not the case. It’ll be 3 years young in November.

    • Spitfire@pawb.social · 3 years ago

      I’m optimistic about Intel’s upcoming “Battlemage” Arc GPUs.

      Someone needs to break up the AMD/Nvidia stagnation, and Intel’s in a good spot to do that. I just hope they keep the price low too, like they did with the first Arc generation.

      • scoredseqrica@lemmy.ml · 3 years ago

        You just know that instead of reducing prices through competition in the GPU space, as soon as Intel is broadly competitive feature- and power-wise, they’ll be rinsing the market just as hard as team red or green. And the new situation will be three sets of overpriced GPUs instead of two. This is Intel we’re talking about here! They’re not exactly consumer-friendly unless they absolutely have to be.

      • iMach@beehaw.org · 3 years ago

        There have been some nice sales on the Arc cards. I’m considering picking one up for its AV1 capabilities.

  • mustyOrange@beehaw.org · 3 years ago

    No shit. When 1080s from six years ago still work fine, there’s clearly some stagnation. They need to cut prices if they want people to actually buy their shit.

    Intel needs to come through with Battlemage and fuck up team red and team green.

    • patchymoose@beehaw.org · 3 years ago

      Whatever happened with Intel’s discrete GPUs? I got whiplash trying to follow the news. At one point I thought they were discontinuing them altogether. Are they proceeding now?

      • mustyOrange@beehaw.org · 3 years ago

        Honestly, pretty damn well. If they keep at it, I see good things for them.

        Imo, the A770 is a lower-midrange hero. They’ve really improved their driver support, and I think Battlemage is going to be great.

    • Pigeon@beehaw.org · 3 years ago

      I think it helps that AAA graphics got so realistic that improvements feel more incremental relative to older games, that indie games proved much simpler, cheaper graphics are viable and often even preferred, and that devs started going for stylized art over realism more often. It probably also helps that the Steam Deck is a thing now and the Switch allows third-party games, so that hardware can be a target to consider too.

      Anyway, yeah. I’m still running a 1070, and at absolute worst I might have to reduce some graphics settings in the newest or most poorly optimized games, and we’re long past the days when moderate or even minimal graphics settings looked awful. Games are still beautiful on lower settings.

      A better GPU at this point would net me better FPS in some titles, but those games make up a relatively tiny proportion of what I play, and even then I still get a perfectly playable framerate as is.

      So, yeah, I’m not paying those prices for a tiny upgrade, not when I remember prices pre-COVID and pre-crypto-mining. I can afford to wait out their greed.

      • Katana314@lemmy.world · 3 years ago

        I keep explaining to people how the world actually kind of benefits from the Graphical Plateau, but so many insist to me, “You will want more pixels. Have you seen raytracing?”

        The Steam Deck mostly sets an upper bound on how much hardware a game should demand for the next few years, and it’s probably lower than some developers wanted it to be.

        The silliest thing about raytracing in particular is that it was intended as a developer convenience. So in an RTX-only future, we were all going to upgrade to much more powerful GPUs, only to run games that look about as good as what we already have.

        • Pigeon@beehaw.org · 3 years ago

          I didn’t know that about raytracing being a developer convenience; that’s really funny.

          I do think raytracing is really cool, and when it’s available I’d rather have it than not, all else being equal. But it seems like the kind of thing I’d notice and appreciate when it’s there, yet whose absence I don’t notice either; I can enjoy my games just the same without it.

        • YuzuDrink@beehaw.org · 3 years ago

          I absolutely love raytracing… and on my 3080 it just doesn’t look good enough yet to justify turning it on in most games. Maybe they just haven’t implemented it well yet, but the reduced framerate in most games isn’t worth it, and I’ve hated effects like screen-space reflections more or less since they came out.

          I think by the time we have a 50X0 or 60X0, raytracing will finally be fast enough to look good AND perform well. But for now it’s mostly a gimmick I turn on to appreciate, then turn back off so I can actually play the game smoothly.

          • Pigeon@beehaw.org · 3 years ago

            It might also be that they’ll put more time and effort into getting it to look right once more people can run it at all. I’m not sure what percentage of PC gamers have sufficiently new and powerful GPUs to run it, but I’d suspect it’s still small, and there’s only so much time and effort devs will want to put into something most people won’t see at all when they could spend those resources on other aspects of the game (including other aspects of graphics) instead.

            The one thing I would really like now is better audio: both better 3D positional audio (e.g. Deathloop if you turn that setting on, although the setting kept turning itself off for me, which was maddening) and more varied and complex sound effects and music. It can make a huge difference, even when people don’t consciously notice.

  • seikoshadow@kbin.social · 3 years ago

    Honestly, I recently purchased a new RTX 4070 Ti and it has absolutely dominated everything I’ve chucked at it. But it was very expensive for what it is, and I can completely understand why people have issues with the latest generations.

    • dwindling7373@feddit.it · 3 years ago

      What do you even mean by “dominated everything”? It saved you significant time in your workflow? It made games run extrasupermegagood?

      1080p 60 fps is plenty, prove me wrong.

      • setInner234@feddit.de · 3 years ago

        Hmmm, your tone is a bit edgy, but perhaps that was unintentional. The difference between 120 fps and 60 fps is pretty huge to me. I once had a 4K monitor (I’m on 1440p now) and played on my other 1080p one instead, just for the higher FPS. Isn’t it a question of preference? Some people prioritise image quality over FPS; some do the opposite. Either is fine, no need to “prove anyone wrong”…

        • dwindling7373@feddit.it · 3 years ago

          Everything is a matter of preference. I just think the industry manufactured a desire for diminishing improvements (fps, resolution) to drive sales, while good and bad games are mostly enjoyed on the basis of being good games.

          Sure, you can tell 120 fps from 60. Will you notice it while playing? Unlikely.

          • setInner234@feddit.de · 3 years ago

            You’re completely right that the games industry is a joke, and it’s a tale as old as time that hardware manufacturers love unoptimised shit, since it lets them sell more expensive crap.

            As for 60 fps, or should I say saturating a 60 Hz display: I’ve noticed that some games are fine at 60 Hz while others feel terrible until the high 90s, and around that level I’m usually fine.

            I used to play a lot of Quake 3 back in the day, and going from 60 to 120 is like two entirely different worlds.

            I think some people pick up on it and others don’t. I used to work in an office where all the monitors ran 4K over HDMI 1.x and were therefore stuck at 30 Hz. Out of a team of 50, only one graphic designer complained about the laggy monitors; everyone else moved their mice around saying they couldn’t tell.

            To me it was torture. I don’t know where the truth of the matter lies. I think console manufacturers long tried to convince everyone that the eye can’t perceive more than 30 Hz, which is insane.

            Maybe now that’s shifted to trying to make everyone believe they need 240 Hz, but you’re obviously getting diminishing returns at that level. I’ve never seen more than 144 Hz myself, but even in my own testing I find anything above 100 imperceptible, so I know where my personal limit lies.
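
            The diminishing returns fall straight out of the frame-time arithmetic; a quick sketch (illustrative only):

            ```python
            # Frame time per refresh rate: each doubling of Hz saves half as
            # many milliseconds as the previous one, hence diminishing returns.
            rates = [30, 60, 120, 144, 240]

            prev_hz, prev_ms = None, None
            for hz in rates:
                ms = 1000 / hz
                gain = f"  ({prev_ms - ms:.1f} ms saved vs {prev_hz} Hz)" if prev_hz else ""
                print(f"{hz:>3} Hz -> {ms:5.1f} ms/frame{gain}")
                prev_hz, prev_ms = hz, ms
            ```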

            For others, maybe younger people, those limits could be higher, who knows. You often hear of people saying they can see monitors and LEDs flicker. I rarely can.

            Back in the 90s I used to play games at 20fps at 640x480, so perspectives can also shift rather dramatically lol

            Lastly, I can only reaffirm that I’d much rather have well-optimised, well-designed games with beautiful art direction than the latest SSAO implementation. Beautiful games from 10 years ago are still beautiful games, whereas path tracing can’t fix your hot pile of AAA garbage…