• Grerkol@leminal.space · 10 hours ago

    I know very little about CGI, so sorry if this is dumb I guess…

    But why would they even consider using a game engine in the first place instead of a program like Maya or Blender? Is it just a bit easier to use for simple things or something? Surely everyone who works for the studios is already used to software that’s actually built specifically for 3D modeling/animation. Also, surely Maya/Blender will always give significantly higher-quality renders anyway, since they don’t have to render in real time like a game would… just why?

    • novibe@lemmy.ml · 1 hour ago

      Rendering in real time is, well, real time.

      It takes dozens of hours to render seconds of some CGI movies.

      It’s just cheaper, in both time and literal energy costs, to use game engines that render everything live.
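
      A rough back-of-envelope (my own numbers, not from any specific production): film runs at 24 frames per second, so a ten-second shot is 240 frames. If each frame takes even 30 minutes on an offline renderer, that’s 120 machine-hours for ten seconds of footage, while a game engine draws those same frames as fast as you watch them.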

    • Vritrahan@lemmy.zip · 9 hours ago

      Like he said, shortcuts. You have to make everything by hand in Maya, including the lighting.

  • 30_to_50_Feral_PAWGs [she/her]@hexbear.net · 1 day ago

    What fucking brain genius thought a game engine would be a good replacement for Maya in anything but a blocking/proof of concept usage scenario?! UE5 on top-of-the-line hardware looks all right, but it’s not “video production” quality by a long shot.

  • MrGabr@ttrpg.network · 24 hours ago

    To everyone saying it’s a slip backwards for games, too, it’s more complicated than that. It’s absolutely possible to make a game that runs at more than 90 fps in UE5; I’ve done it in VR. The engine just makes it super easy to be lazy, and when you combine that with modern AAA “optimization is for suckers” game dev philosophy, that’s where you get performance like Borderlands 4.

    I think people only notice UE5 games running badly, and don’t realize when it’s fine. Clair Obscur was in UE5 and I never dropped below 60fps on max settings except in one area. Avowed was in UE5, probably a really early version like 5.2 or 5.3, based on when it released (the latest it could’ve been is 5.5, but it’s bad practice to switch major engine versions too far into development, so I’d doubt they updated even to 5.4). Avowed had bugs for sure, but not performance issues inherent to the engine.

    I think blaming UE5 lets lazy development practices off easy. I’ll take it over Unity for sure (I’ve experienced Unity fail at basic vector math, and that’s before getting into how no one should ever trust them again after that per-install fee stunt). We should be aiming that same frustration at developers for not optimizing. Lumen was not ready when it came out, and Nanite requires a minimum hardware spec that’s still absurd, but it’s literally two switches to flip in project settings to turn those off (sketched below). UE5 is really an incredible piece of technology, and it has made, and continues to make, game making accessible on a scale comparable to when Unity added a free license. AAA developers get off easy when you blame the engine instead of their garbage code.
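
    In case anyone wants the actual switches, this is roughly what they look like in DefaultEngine.ini (a minimal sketch; CVar names assume a recent stock UE5 and can shift between engine versions):

    ```ini
    [/Script/Engine.RendererSettings]
    ; turn off Lumen: no dynamic GI, fall back to screen-space reflections
    r.DynamicGlobalIlluminationMethod=0
    r.ReflectionMethod=2
    ; turn off Nanite for the whole project
    r.Nanite.ProjectEnabled=False
    ```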

    ~Godot is a beautiful perfect angel that needs a new 3D physics engine~

    • Horse {they/them}@lemmygrad.ml · 3 hours ago

      I’ve experienced Unity fail at basic vector math, and that’s before getting into how no one should ever trust them again after that per-install fee stunt

      mildly related: over the years i’ve seen a concerning number of game updates, some for games that hadn’t been touched in years, with a single line in the changelog that says “fixed unity security vulnerability”

      • MrGabr@ttrpg.network · 39 minutes ago

        There was a bug recently fixed in Unity where if your system was already infected, a virus could run any code through Unity, possibly gaining privileges. I felt Unity slightly overstated the severity in their announcement to developers, but when you get an email from Unity saying “we fixed a critical engine vulnerability, update your game ASAP,” it can be quite panic-inducing.

    • gaycomputeruser [she/her]@hexbear.net · 19 hours ago

      The problem isn’t just the performance; UE5 also doesn’t look very good, especially given the amount of hardware that’s needed. Some of the biggest problems in my mind are the blurriness of the image (apparently due to lots of temporal techniques) and the UE5 lighting, which gives games a very distinct and unrealistic look compared to other engines. Further, the vast majority of skin in UE is terrible.
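
      For what it’s worth, the temporal part is apparently swappable per-project; a minimal sketch of changing the default anti-aliasing method in DefaultEngine.ini (values assume a stock recent UE5):

      ```ini
      [/Script/Engine.RendererSettings]
      ; 0 = none, 1 = FXAA, 2 = TAA, 3 = MSAA, 4 = TSR (UE5's temporal default)
      r.AntiAliasingMethod=1
      ```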

      • MrGabr@ttrpg.network · 16 hours ago

        That’s fair, and you really see that on games like Norse where they don’t have the resources to make custom material and post-processing shaders, but they still want it to look like AAA photorealism (a bad strategy to begin with but that’s their problem). Out of the box, though, UE5 still looks leagues better than anything else that isn’t proprietary, and I’d argue that if you do have the time/staff to dedicate an entire team to technical art, the ceiling of how good UE5 can look, if you’re going for photorealism, is higher than it is for Unity and Godot as well.

        To the original context of the post, that ceiling is still way lower than what should be acceptable quality for big-budget movie CGI, but regarding games, I’m gonna stick to my original point and say that’s still an issue on the developers’ part for not putting in the effort to make it look good. Even accounting for optimization and visual tweaking, they’re still saving enormous amounts of time and money by using UE5 instead of their own engine, and that effort should be expected, the lack thereof not excused.

        • gaycomputeruser [she/her]@hexbear.net · 13 hours ago

          That’s a very fair point on the graphical quality! From my pov, it seems like Epic needs to make it easier for developers to optimize their projects, given the number of games that haven’t had a lot of that work done. I’m sure that’s easier said than done, though.

          It really is strange to me that UE5 is being used for film given there are other raster rendering engines designed for better image quality. I’m assuming here that part of the benefit the studios are looking for is the speed increase from not having to prerender scenes on large server farms, and the flexibility they get from setups like Disney’s “the volume.”

          • MrGabr@ttrpg.network · 10 hours ago

            AFAIK, the speed increase to allow technology like the volume is the whole pitch. Not every studio has an entire volume, so lower-budget filmmakers can set up a system with a green screen where the cinematographer can see the CGI environment in real-time through the camera, and with the asset store integration, indie filmmakers can have an insane set/backdrop for a tiny fraction of the normal price.

            Now that I think about it, though, Mr. Verbinski here is placing undue blame on UE5, when Marvel’s CGI has been getting worse and worse because they throw an army of slaves at the footage after the fact, rather than paying artists and working with them to set up shots to make the CGI as easy as possible, like he did.

    • JakenVeina@midwest.social · 16 hours ago

      My example would be Satisfactory. That game ran GREAT for years on my freakin’ 10-year-old 1070. It was only in 2025 that I started having some minor framerate issues in areas with a whole lot of cosmetics and machinery (which is inevitable in the factory/automation genre, where the game really can’t control how much players will ask it to render). And then I had the SAME kinds of issues after upgrading to a 3080, until I switched to Linux, so really the 1070 might never have been the issue anyway.

  • Awoo [she/her]@hexbear.net · 1 day ago

    Greatest slip backwards for games too: I will not get above 60fps in most games using it without framegen. I’m probably misplacing the blame on Unreal for this, though; AI framegen existing caused this, as it made devs lazier.

    • hypercubie4@lemmygrad.ml · 1 day ago

      I agree, in that in most games I play (not using UE5), I just set the basic settings I want and get above 90fps.

      messing with the settings (to get above 60fps) in a UE5 game is a chore, and most times frame gen barely works; upscaling looks like shit too.

      but imo devs didn’t get lazier, UE5 + upscale/framegen made it easier for them to not have to optimize a game, and then (more often than not) when they are forced to crunch, optimizations are frequently out of the question, especially on launch.

      • Awoo [she/her]@hexbear.net · 1 day ago

        but imo devs didn’t get lazier, UE5 + upscale/framegen made it easier for them to not have to optimize a game, and then (more often than not) when they are forced to crunch, optimizations are frequently out of the question, especially on launch.

        I mean, that specifically made them lazier. They could say the game gets 100fps when in reality it’s 50fps without DLSS. Then they would do other shit instead.

        The option existing has made them lazier about optimising and made the quality of the product lower. Practically everyone complains about performance now, compared to pre-framegen games where you could expect a high-end GPU to actually perform. Now even the 4070 I have barely hits 50 without DLSS in most new games.

        It annoys me considerably, because wtf do I even have a high refresh rate monitor for if games aren’t even running at half my monitor’s refresh rate?

            • hypercubie4@lemmygrad.ml · 2 hours ago

              when they are forced to crunch, optimizations are frequently out of the question, especially on launch.

              I mean, that specifically made them lazier…

              • Awoo [she/her]@hexbear.net · 1 hour ago

                Oh right, I see now. Yes. You and I both agree that part is not really the cause. Devs were being forced to crunch long before recent times, and they had more optimised games in the past.

                • hypercubie4@lemmygrad.ml · 16 minutes ago

                  i guess i should have added that crunch never left; it’s just changed with the landscape. i agree that the cause of poorer optimization stems from UE5 + the mentioned “features.”

                  but i wouldn’t go so far as to say a modern-day game dev, as a worker, is lazy. perhaps the leads and above in a AAA studio are, absolutely. but someone who directly works on a game? that’s not a UE5 problem, more so an industry problem.

        • booty [he/him]@hexbear.net · 24 hours ago

          I think the Oblivion remaster is turning me into the Joker. The game looks fine, unless you want it to run at a reasonable FPS. Then you have to turn on the AI slopifier, which makes the game look worse than the original. So you can run it at half the FPS of the original, with stuttering.

          What the fuck is the POINT
