• @njordomir@lemmy.world
    13 points · 2 months ago

    Yep, that seems like the ideal decentralized solution. If all the info can be distributed via torrent, anyone with spare disk space can help back up the data and anyone with spare bandwidth can help serve it.

    • @Shdwdrgn@mander.xyz
      8 points · 2 months ago

      Most of us can’t afford the sort of disk capacity they use, but it would be really cool if there were a project to give volunteers pieces of the archive so that information was spread out. Then volunteers could specify if they want to contribute a few gigabytes to multiple terabytes of drive space towards the project and the software could send out packets any time the content changes. Hmm this description sounds familiar but I can’t think of what else might be doing something similar – anyone know of anything like that that could be applied to the archive?

      • @njordomir@lemmy.world
        5 points · 2 months ago

        Yeah, the projects I’ve heard of that have done something like this broke the archive into multiple torrents.

        For example, 1000GB could be broken into forty 25GB torrents, and within each torrent you can tell the client to download only some of the files.

        At scale, a webpage can show the seed/leech numbers and averages for each torrent over a time period, giving an idea of what is well mirrored and what people can shore up. You could also change which torrent is shown as the top download when people visit the contributor page and say they want to help host, ensuring a better distribution.
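        A rough sketch of that "feature the least-mirrored torrent" logic (all names and numbers here are made up for illustration, not from any real tracker):

```python
# Hypothetical sketch: given seed/leech stats per chunk torrent, surface the
# least-seeded one to new volunteers so scarce data spreads fastest.
def pick_torrent_to_feature(stats):
    """stats maps torrent name -> (seeders, leechers)."""
    # Fewest seeders first; break ties by demand (more leechers first).
    return min(stats, key=lambda t: (stats[t][0], -stats[t][1]))

stats = {
    "archive-part-01": (120, 4),
    "archive-part-02": (3, 9),
    "archive-part-03": (3, 1),
}
print(pick_torrent_to_feature(stats))  # archive-part-02
```

        The tie-break matters: among equally under-seeded chunks, featuring the one with more active leechers gets it replicated soonest.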

      • @rottingleaf@lemmy.world
        0 points · 2 months ago

        Since I’m spamming with this same idea right now: the description is similar to Freenet (the old one, now called Hyphanet), but you’d need some way to choose which collections of data get stored in your contributed storage. With Freenet, the whole network decides (unless you form a separate F2F net, which is an option, but then there’s no way to be sure that all peers store only IA data and not, say, their own porn collections, eating precious storage). I’ve described one idea in my previous comment, but it’s purely an idea; I’m nowhere near having the knowledge to build such a thing.

    • @rottingleaf@lemmy.world
      1 point · 2 months ago

      There’s an issue with torrents: only the most popular ones get replicated, and the process is manual/social.

      Something like Freenet is needed, which automatically “spreads” data over the machines contributing storage; but Freenet is unreliable storage, basically a cache where older and unwanted stuff gets erased.

      So it should be something like Freenet, but possibly with “clusters” or “communities”, each with a central (cryptography-backed) authority able to determine the state of some collection of data as a whole and pick priorities. My layman’s understanding is that this would be somewhere between Freenet and Ceph, LOL. More like a cluster filesystem spread over many nodes than like a cache.
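      One way to picture that "central authority picks priorities" idea is a coordinator that greedily assigns archive chunks to volunteer nodes, giving higher-priority collections their replicas first. This is purely a sketch of the concept; the node names, sizes, and the replica count are assumptions for illustration, not any real system.

```python
import random

def assign_chunks(chunks, nodes, replicas=3):
    """Greedy placement sketch.
    chunks: list of (chunk_id, priority, size_gb).
    nodes:  dict of node_id -> free_gb (mutated as space is claimed)."""
    placement = {cid: [] for cid, _, _ in chunks}
    for cid, _, size in sorted(chunks, key=lambda c: -c[1]):  # high priority first
        # Only nodes with enough free space can hold this chunk.
        candidates = [n for n, free in nodes.items() if free >= size]
        for n in random.sample(candidates, min(replicas, len(candidates))):
            placement[cid].append(n)
            nodes[n] -= size
    return placement

placement = assign_chunks(
    [("text-archive", 10, 1), ("media-archive", 1, 5)],
    {"alice": 10, "bob": 10, "carol": 10, "dave": 2},
)
print(placement)
```

      A real system would need repair (re-replicating when nodes vanish) and the cryptographic attestation mentioned above, but the core scheduling decision is this simple priority-then-capacity loop.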

      • @njordomir@lemmy.world
        1 point · edited · 2 months ago

        You have more knowledge on this than I do. I enjoyed reading about Freenet and Ceph. I have dealt with cloud stuff, but not as much at the technical-underpinnings level. My first Freenet impression, from reading some articles, gives me 90s internet vibes based on the common use cases they listed.

        I remember Ceph because I once ended up building it from the AUR on my weak little personal laptop: it got dropped from some repository or whatever but was still flagged to stay installed. I could have saved myself an hours-long build if I had read the release notes.

        • @rottingleaf@lemmy.world
          1 point · 2 months ago

          > My first Freenet impression from reading some articles gives me 90s internet vibes based on the common use cases they listed.

          That’s correct; I meant the way it works.