Windows 11 often requires new hardware, but for a while new machines will either be extremely pricey or ship with very little RAM.

I don't believe a single competent person works at Micro$oft anymore, but maybe, just maybe, this could lead them to make a less shitty OS?

And garbage software like Adobe Creative Cloud too?

They obviously don't care about users, but at some point the pain could become too great.

  • tomkatt@lemmy.world · +3 · 3 hours ago

    There’s plenty of “unbloated” software available. It’s just not on Windows.

  • shiroininja@lemmy.world · +6 · 5 hours ago

    Do people really use that much RAM in normal use? I rarely even fill my 16 GB, even with gaming, etc. I just don't leave 16 tabs open in a browser, because that feels really disorganized. And I turn my computer off every night and start fresh every day.

    • pantherina@feddit.org (OP) · +5 · 5 hours ago

      Yes. 16 GB is the bare minimum for regular usage on Windows. On Linux, it's the minimum for "regular to advanced" usage (i.e. more than five complex programs open at once, Flatpak, Electron apps).

  • Camille_Jamal@lemmy.zip · +9 · 7 hours ago

    No, they don't care about users or whether they're literally cooking RAM. They'll keep it bloated, and probably make it more bloated.

  • mycodesucks@lemmy.world · +18 · 9 hours ago

    It’s a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they’ll likely just double down.

    Nobody reassesses their dogma just because the justification for it is no longer valid. That’s not how people work.

  • CMDR_Horn@lemmy.world · +145 · 14 hours ago

    Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.

      • antrosapien@lemmy.ml · +2 · 7 hours ago

        Yes, but with AI, you can build it in 4 hours, and with all those extra RAMs, it could drop to 2

    • Riskable@programming.dev · +9/−3 · 11 hours ago

      Big AI is a bubble but AI in general is not.

      If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.

      I suspect that as more software gets AI-assisted development we’ll actually see less efficient software but eventually, more efficient as adoption of AI coding assist becomes more mature (and probably more formalized/automated).

      I say this because of experience: If you ask an LLM to write something for you it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base to make it more efficient, it often does a great job. The dichotomy is due to the nature of AI prompting: It works best if you only give it one thing to do at a time.

      In theory, if AI code assist becomes more mature and formalized, the “optimize this” step will likely be built-in, rather than something the developer has to ask for after the fact.

  • mushroommunk@lemmy.today · +87 · edited · 14 hours ago

    It's not just garbage software. So many programs are just Electron apps, which is about the most inefficient way of making them. If we could start actually making programs again, instead of shipping a webpage bundled with a browser, you'd see resource usage plummet.

    In the gaming space, even before the RAM shortage, I'd seen more developers begin doing optimization work again thanks to the prevalence of the Steam Deck and such. The precedent is there, and I'm hopeful other developers will start considering lower-end hardware.

    • Suburbanl3g3nd@lemmings.world · +13/−2 · 13 hours ago

      Probably a super unpopular take, but by sheer volume of consoles sold, the Switch and Switch 2 have done more for game optimization than the Steam Deck ever could. I agree the Steam Deck pushed things further, but the catalyst is the Switch/2.

      • XeroxCool@lemmy.world · +7 · 13 hours ago

        I take it the Switch/S2 has many non-Nintendo games shared with other consoles? Hard to search through 4,000 titles on Wikipedia to find them at random, but I did see they had one Assassin’s Creed (Odyssey) at the game’s launch. I never really had Nintendo systems and just associate them with exclusive Nintendo games.

        I’m choosing to believe the Steam Machine will do more of the same for PC games. Maybe it won’t force optimization at launch, but I hope it maintains itself as a benchmark for builds and provides demand for optimization to a certain spec.

        • FoxyFerengi@startrek.website · +3 · 10 hours ago

          I only own one Nintendo game on my Switch. I’m not going to sit here and pretend most of my games run great on it though. Slay the Spire and Stardew run well. But I’ve had quite a few crashes with Civilization and some hangs with Hades or Hollow Knight too

        • mushroommunk@lemmy.today · +2 · 11 hours ago

          I try to follow the gaming space, and I didn't really see anyone talk about optimization until the Steam Deck grew. I do wish more companies were open about their development process so we actually had some data. The Switch/Switch 2 very well could have pushed it, but I think with those consoles people just accept that they might not get all the full modern AAA games; they're getting Pokémon and Mario and such. Whereas with the Steam Deck, they want everything in their Steam library. I dunno.

          I have no real data, just what I’ve seen people discussing.

      • CountVon@sh.itjust.works · +3/−3 · 9 hours ago

        So the developers of PC games like Clair Obscur: Expedition 33, which doesn't have a Switch version of any kind, spent time, effort, and money to optimize specifically for the Steam Deck… because of the Switch's market share? C'mon now bud, that's a straight-up ridiculous take.

    • Brkdncr@lemmy.world · +8/−3 · 13 hours ago

      Web apps are a godsend and probably the most important innovation to help move people off of Windows.

      I would prefer improvements to web apps and electron/webview2 if I had to pick.

      • bufalo1973@piefed.social · +10 · 12 hours ago

        If those web apps all shared the same Electron backend, they could be "a godsend". But each of those web apps ships its own Electron backend.

        • Brkdncr@lemmy.world · +3 · 12 hours ago

          The beauty of it is that Electron/WebView2 will probably get improved, and you don't need to fix the apps.

          • bufalo1973@piefed.social · +2 · 10 hours ago

            I don't disagree with that. But the problem is having one Electron backend for each web app instead of one backend for all web apps.

    • PeriodicallyPedantic@lemmy.ca · +1 · 13 hours ago

      Idk, I don't think the issue is Electron apps using 100 MB instead of 10 MB. The kinds of apps you write as HTML/JS are almost always inherently low-demand, so even 10x-ing their resource usage doesn't really cause a problem, since you're not typically doing other things at the same time.

      The issue is the kind of apps that require huge system resources inherently (like graphically intensive games or research tools), or services that run in the background (because you’ll have a lot of them running at the same time).

      • mushroommunk@lemmy.today · +4 · 10 hours ago

        You’re off by a large margin. I’ll use two well documented examples.

        WhatsApp native used about 300 MB with large chats, and CPU usage stayed relatively low and constant. That wasn't great, but it's a separate issue. The new WebView2 version hits over a gig and spikes the CPU more than some of my games.

        Discord starts at 1 GB of memory usage and exceeds 4 GB during normal use. That's straight from the developers. It's so bad they have started rolling out an experimental update that makes the app restart itself when it hits 4 GB.

        Those are just two Electron apps meant mostly for chatting, and that's up to 5 GB between them. Electron and WebView2 both spin up full Node.js runtimes and multiple JavaScript heaps, plus whatever GPU threads they run, and they are exceedingly bad at releasing resources. That's exactly why they are the problem. Yes, the actual JavaScript bundles Discord and WhatsApp use are probably relatively small, but you get full Chromium browsers, and all of their memory usage issues, stacked on top.
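        The restart-at-4-GB mitigation mentioned above is essentially a memory watchdog. A minimal sketch of the idea (hypothetical, Linux-only, and not Discord's actual implementation):

```python
import os
import sys

RSS_LIMIT_BYTES = 4 * 1024**3  # the 4 GB threshold mentioned above


def current_rss_bytes():
    """Best-effort resident set size of this process.

    Linux-only sketch: parses the VmRSS line of /proc/self/status.
    Returns 0 when the file is unavailable (e.g. on other platforms).
    """
    try:
        with open("/proc/self/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1]) * 1024  # value is in kB
    except OSError:
        pass
    return 0


def maybe_restart():
    """Re-exec the current process if it has outgrown the memory limit.

    execv replaces the process image, so all leaked memory is returned
    to the OS; the app starts over with the same argv.
    """
    if current_rss_bytes() > RSS_LIMIT_BYTES:
        os.execv(sys.executable, [sys.executable] + sys.argv)
```

        A real app would call `maybe_restart()` periodically from a timer and save state first; restarting is a workaround for leaks, not a fix.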

        • PeriodicallyPedantic@lemmy.ca · +1 · 8 hours ago

          Right
          But those are only problems because they use the resources in the background. When the foreground app uses a lot of resources it’s not a problem because you only have one foreground app at a time (I know, not really, but kinda). Most apps don’t need to run in the background.

          • mushroommunk@lemmy.today · +3 · edited · 7 hours ago

            Yes, that's the problem? I'm confused about what you're not getting here. Those programs are made to run constantly, and many people need both for various reasons. Add a main program like Photoshop and you don't have enough RAM. People don't load Discord, check a message, close it, load WhatsApp, check it, close it, then load Photoshop.

            The RAM usage doesn’t suddenly stop because you alt+tab to a different program.

            • PeriodicallyPedantic@lemmy.ca · +1/−1 · 4 hours ago

              There are, of course, bad offenders.

              I’m just skeptical that “webapps that need a ton of resources and people leave open” is the norm. But I haven’t done any research on it so maybe it is.

  • kboos1@lemmy.world · +17 · 12 hours ago

    The "shortage" is temporary and artificial, so that's a hard NO. The RAM shortage doesn't present any incentive to make apps more efficient, because the hardware and software already in people's homes won't be affected by it, and neither will the people currently using the software. The very small percentage of people who will be affected by a temporary shortage wouldn't justify making changes to software that is currently in development.

    There's no incentive for software companies to make their code more efficient until people stop using their software, so stop using it and it will get better. As an example: Adobe Reader is crap, just straight-up garbage, but people still use it, so the app stopped getting improvements many years ago. Then Adobe moved to a subscription-based system and a cloud service for selling your data, but guess what: it's still the same app it was 10 years ago, just more expensive.

    • SkyNTP@lemmy.ml · +3/−2 · 9 hours ago

      What crystal ball told you this was temporary? Every day for the past few years the consumer market moves further and further into serving only the wealthy. The people in power don’t care about selling RAM or other scraps to peasants.

      • kboos1@lemmy.world · +1 · 3 hours ago

        History and normal market cycles. I’ll remind you of the great GPU shortage caused by Bitcoin miners.

      • 4am@lemmy.zip · +1/−2 · 8 hours ago

        Downvoted by libs with their collective heads in the sand.

        It might not wind up working, but Altman, Nadella, et al. are trying to push all consumers to forever rent compute from them.

        They do not want you to be able to run your own DeepSeek at home. They do not want you to control the hub of your smart home. They want to know what's in the spreadsheet you saved, what's in the business plan you typed up, and what the password is to any E2EE service you have an account with.

        They want to forecast you like the weather.

        • shiroininja@lemmy.world · +2 · 5 hours ago

          They're still losing billions a year. OpenAI's future doesn't even seem certain yet. Eventually investors will catch on.

  • ChillPC@programming.dev · +28/−1 · 14 hours ago

    You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs Discord, Slack, 100 Chrome tabs, Word, and every other Electron app open simultaneously, they'll just push through their work. They may not be happy about it, but they'll continue without changing their habits.

    But to be honest, I goddamn hope you are right!

    • pantherina@feddit.org (OP) · +1 · 6 hours ago

      The impact is that your software runs even worse on existing hardware. It might not be a big impact, but it's an impact.

    • atro_city@fedia.io · +5 · 14 hours ago

      Why do you believe so? Do you believe software developers earn too much to care about RAM prices and will continue to write software that requires more RAM than the rest of the world can afford?

      • CarbonatedPastaSauce@lemmy.world · +11 · 13 hours ago

        Because that kind of shift in mindset (going backwards, basically) will require far more pressure than a 1-2 year RAM shortage.

        Enterprise developers are basically unaffected by this. And anyone writing software for mom-and-pop users was already targeting 8 GB, because that's what Office Depot is selling them.

        This mostly hurts the enthusiast part of tech. Most people won't notice, because they don't know the difference between 8, 16, or over 9000 GB of RAM. I've had this discussion with "users" so many times when they ask for PC recommendations, and they just don't really get it, or care.

      • drcobaltjedi@programming.dev · +4 · 12 hours ago

        As a software dev: there's a lot of stuff that's just bloat now. Electron apps are really easy for web devs to write and make pretty, and they're super portable, but each one is literally an instance of a Chrome browser. There are still a lot of devs who care (to some degree) about performance and are willing to trim fat or take small shortcuts where viable.

        However, there's also the issue of management. I was once tasked with a problem at work involving the traveling salesman problem. I managed to make a very quick solution that worked fairly well, but it always left one point for last that probably should have been visited around third. Anyway, it was quick and mostly accurate, but my boss told me to "fix it", and despite my explanation that he was asking me to solve an unsolved math problem, he persisted. I'm now ashamed of how slow that operation is, since instead of just finding the nearest point it now has to look ahead a few steps to see which path is shorter.
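        The trade-off described above can be sketched as the greedy nearest-neighbor heuristic versus a lookahead variant that scores each candidate by the cheapest path over the next few hops. This is a hypothetical reconstruction in Python, not the commenter's actual code:

```python
import itertools
import math


def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def nearest_neighbor(points, start=0):
    """Greedy tour: always jump to the closest unvisited point.

    Fast (O(n^2)), but can strand a point that should have been
    visited earlier, exactly the flaw described in the comment.
    """
    unvisited = set(range(len(points)))
    tour = [start]
    unvisited.remove(start)
    while unvisited:
        cur = tour[-1]
        nxt = min(unvisited, key=lambda i: dist(points[cur], points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour


def lookahead(points, start=0, depth=3):
    """Greedy tour that scores candidates over the next `depth` hops.

    Avoids some stranded points, but the per-step cost now grows
    combinatorially with depth, hence the slowdown.
    """
    unvisited = set(range(len(points)))
    tour = [start]
    unvisited.remove(start)
    while unvisited:
        cur = tour[-1]
        k = min(depth, len(unvisited))
        # Pick the cheapest k-hop continuation; commit only its first step.
        best = min(
            itertools.permutations(unvisited, k),
            key=lambda path: sum(
                dist(points[a], points[b])
                for a, b in zip((cur,) + path, path)
            ),
        )
        tour.append(best[0])
        unvisited.remove(best[0])
    return tour


def tour_length(points, tour):
    """Total length of the open tour."""
    return sum(dist(points[a], points[b]) for a, b in zip(tour, tour[1:]))
```

        Neither heuristic is optimal (TSP is NP-hard), which is the point: the boss was asking for an exact answer to a problem where only trade-offs exist.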

      • bufalo1973@piefed.social · +1 · 12 hours ago

        I remember a PC programmer in the '80s who wrote his programs in GW-BASIC. When I asked why he was using that instead of a better language that could produce faster and smaller programs, his answer was: "If this doesn't run fast enough on the client's PC, the client will buy a better PC." That's the mindset: "it's not my problem once I sell it."

      • badgermurphy@lemmy.world · +8 · 13 hours ago

        For the most part, the answer seems to be yes. Some products did also ship with missing or reduced feature sets for a time, too.

      • magic_lobster_party@fedia.io · +7 · 13 hours ago

        Dealing with memory usage will likely require significant rewrites and architectural changes. It will take years.

        The "memory optimizations" we'll see are the removal of features while charging the same price. Software shrinkflation. It will require the same amount of memory, though.