• @nadir@lemmy.world
      4 months ago

      Read about this on the Garage project’s site. They apparently wouldn’t be a thing without this funding.

      Recently set up a cluster and it’s great. Sad to hear this went through.

    • @endofline@lemmy.ca
      4 months ago

      Well, they still can’t underfund GNU Privacy Guard :-) It’s pretty much a finished product already and works pretty well.

      • @frezik@midwest.social
        4 months ago

        Eh? I tried it about a year ago, and I found all the same clunky problems that were there 20 years ago.

        • @endofline@lemmy.ca
          4 months ago

          clunky

          Thunderbird + Kleopatra? K-9 + OpenKeychain (Android)? Where did you have issues?

          • @frezik@midwest.social
            4 months ago

            I went through an exercise with a few other developers to see if we could use it for transferring sensitive information. I was using Windows w/WSL2 at the time (now I’m full Linux for my work machine), and I believe the other two were on Macs.

            Our conclusion was that while it might be useful alongside other methods, it was too clunky for general use. One of the more useful things we could do was have developers sign git commits.
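
            For the commit signing piece, something along these lines is all it takes (the key ID below is just a placeholder):

                # Tell git which GPG key to sign with (placeholder key ID)
                git config --global user.signingkey 0xDEADBEEF12345678
                # Sign every commit by default
                git config --global commit.gpgsign true
                # Or sign a single commit explicitly
                git commit -S -m "some change"
                # Check the signature later
                git log --show-signature -1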

            The email plugins for various clients make it easy to mistakenly think you’re sending an encrypted email. When even technical people make this mistake, it’s a big issue for widespread adoption. The plugins also don’t always send messages in a format that works with every client out there. We found the most consistent way was to encrypt the message into a file and attach it to the email.
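
            By “encrypt the message in a file” I mean roughly this (the recipient address is made up):

                # Encrypt the message as a standalone, ASCII-armored file that any mail client can carry as an attachment
                gpg --encrypt --armor --recipient alice@example.com message.txt
                # Attach the resulting message.txt.asc to a plain email; the recipient decrypts it on their end
                gpg --decrypt message.txt.asc > message.txt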

            The plugins don’t work with modern webmail, anyway.

            Public key servers are unreliable. They’re largely maintained by volunteers, so this is understandable, but we couldn’t recommend that the company use them. If we wanted reliability, we’d need to run our own internal keyserver.
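
            For reference, this is the kind of lookup we’d be depending on (keys.openpgp.org is just one example of a public server, and the key ID is a placeholder):

                # Pull a correspondent’s key from a public keyserver
                gpg --keyserver hkps://keys.openpgp.org --recv-keys 0xDEADBEEF12345678
                # Or search by email address
                gpg --keyserver hkps://keys.openpgp.org --search-keys alice@example.com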

            Then there’s the key signing meetings we’d need to have. Even technical people find these a bother. These are, unfortunately, inherent to the web of trust model.

            I really wanted to make it work. The decentralized nature of the web of trust, as opposed to the hierarchical model of TLS, is appealing to me personally. But this shit hasn’t gotten better in 20 years, and at least some of it is unfixable.