• u/lukmly013 💾 (lemmy.sdf.org)

    How can I check this?

    I tried loading the (new) Reddit homepage, and based on the Network tab in Firefox without any prior cache, it came to 19.20 MB, compressed down to 15.29 MB transferred.
    But that also includes any pictures shown.
    Loading the lemmy.world homepage came to 5.88 MB, compressed down to 1.82 MB.
    old.reddit.com: 2.82 MB, compressed down to 947 kB. Quite a difference.

    Just for comparison, loading Eaglercraft 1.5.2, a fully functional Minecraft JavaScript clone complete with LAN multiplayer support, took 8.35 MB.
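
    If you want those numbers without eyeballing the Network tab, here is a rough console sketch using the Resource Timing API. It won't match Firefox's summary exactly, and cross-origin resources report 0 bytes unless the server sends Timing-Allow-Origin, so treat it as a lower bound:

    ```js
    // Rough page-weight check, runnable in the browser console.
    // transferSize  = bytes over the wire (compressed),
    // decodedBodySize = size after decompression.
    const entries = [
      ...performance.getEntriesByType('navigation'),
      ...performance.getEntriesByType('resource'),
    ];
    const wire = entries.reduce((sum, e) => sum + e.transferSize, 0);
    const decoded = entries.reduce((sum, e) => sum + e.decodedBodySize, 0);
    console.log(`${(decoded / 1e6).toFixed(2)} MB, compressed down to ${(wire / 1e6).toFixed(2)} MB transferred`);
    ```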

    But what exactly is this measuring?

    • @joneskind@lemmy.world

      > But what exactly is this measuring?

      Hard to tell honestly.

      phpBB and WordPress are website engines. The comparison doesn’t take into account the content of the sites they serve, and more importantly the bloated advertising scripts that might be added to the page source.

      Mastodon? What are we even talking about here? The content? The engine? Which instance?

      So, while it’s true that some websites are bloated and some are not, OP’s post says absolutely nothing about it. Size means nothing when a single picture can easily outweigh a huge JavaScript file mining some bitcoins. For the same reasons, loading times mean nothing either.

      Memory usage, FPS, Cumulative Layout Shift, First Input Delay, Largest Contentful Paint, any data gathered from the Performance API. There are tons of efficient ways to measure a website’s efficiency.
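
      For instance, a minimal sketch of collecting two of those metrics with PerformanceObserver (the layout-shift and largest-contentful-paint entry types are currently Chromium-only, so this reports nothing in Firefox):

      ```js
      // Largest Contentful Paint: each entry is a new LCP candidate;
      // the last one before user input is the reported LCP.
      new PerformanceObserver((list) => {
        for (const entry of list.getEntries()) {
          console.log('LCP candidate at', Math.round(entry.startTime), 'ms:', entry.element);
        }
      }).observe({ type: 'largest-contentful-paint', buffered: true });

      // Cumulative Layout Shift: sum shift scores not caused by recent input.
      let cls = 0;
      new PerformanceObserver((list) => {
        for (const entry of list.getEntries()) {
          if (!entry.hadRecentInput) cls += entry.value;
        }
        console.log('CLS so far:', cls.toFixed(4));
      }).observe({ type: 'layout-shift', buffered: true });
      ```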

      Finally, a website can fail to load for many reasons, the first of which can be a 504 Gateway Timeout, an event based on an arbitrary timeout value configured on the server’s side.

    • @bloodfart@lemmy.ml

      The author of the article used Chrome with CPU throttling set to 10x, to compare an M3 against itself at 1/10th CPU speed. I imagine you could check it that way!
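
      If you'd rather script that than click through DevTools, Puppeteer exposes the same throttling knob. A sketch, assuming Node.js with puppeteer installed (the URL and the 10x factor just mirror the comparison above):

      ```js
      // Load a page under 10x CPU throttling and report its load time.
      import puppeteer from 'puppeteer';

      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.emulateCPUThrottling(10); // same 10x slowdown as DevTools
      await page.goto('https://old.reddit.com', { waitUntil: 'networkidle2' });
      const loadMs = await page.evaluate(
        () => performance.getEntriesByType('navigation')[0].duration
      );
      console.log(`Load took ${Math.round(loadMs)} ms at 10x throttling`);
      await browser.close();
      ```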