The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • xigoi
    16 points · 1 year ago

    How exactly does that prevent someone from uploading a fake video?

    • @sv1sjp@lemmy.world
      4 points · 1 year ago

      The point is to know when a video was uploaded, as well as which videos precede and follow it, for uses such as security cameras, car accidents, etc., so that a video can be trusted. (More information can be found in the paper.)

      • @taladar@feddit.de
        9 points · 1 year ago

        Not even that. It only allows you to verify that the source is identical to the (potentially wrong) information that was claimed at the time of recording by the person adding it to the blockchain. Blockchain, as usual, adds nothing here.

        • @fiah@discuss.tchncs.de
          -2 points · 1 year ago

          Blockchain, as usual, adds nothing here.

          it can add trust. If there’s a trusted central authority where these hashes can be stored then there’s no need for a blockchain. However, if there isn’t, then a blockchain could be used instead, as long as it’s big and established enough that everybody can agree that the data stored on it cannot be manipulated

          • nudny ekscentryk
            9 points · 1 year ago

            but false, nonconsensual nudes are not collectible items that need to have their authenticity proven. They exist to destroy people’s lives. Even if 99% of the people seeing your nude believe you that it’s not authentic, it still affects you heavily.

            • @fiah@discuss.tchncs.de
              5 points · 1 year ago

              nonconsensual nudes are not collectible items that need to have their authenticity proven

              of course not, but that’s not what this comment thread is about. It’s about this:

              Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

              that’s where it can be very useful to store a fingerprint of a file in a trusted database, regardless of where that database gets its trust from

              • nudny ekscentryk
                -2 points · 1 year ago

                sure, but again: why would anyone want to do that with consensual or nonconsensual nudes?

                  • nudny ekscentryk
                    -1 points · 1 year ago

                    it very much is:

                    OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

                    parent reply: That’s why we need Blockchain Technology

      • nudny ekscentryk
        2 points · 1 year ago

        yeah, but the problem is the mere existence of tools allowing pornographic forgery, not verifying whether the material is real or not
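
The hash-registry mechanism debated in this thread can be sketched in a few lines. This is a minimal illustration, not any specific project's implementation: the `registry` list here is a hypothetical stand-in for whatever trusted store is used (a central database or a blockchain). Note that, as pointed out above, verification only proves the bytes match what was registered, not that the registered content was truthful to begin with.

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """Compute a SHA-256 fingerprint of the file contents."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical append-only store standing in for a trusted central
# database or a blockchain; only the hash and a timestamp are kept.
registry = []

def register(data: bytes) -> dict:
    """Record the file's fingerprint at the current time."""
    entry = {"hash": fingerprint(data), "timestamp": time.time()}
    registry.append(entry)
    return entry

def verify(data: bytes) -> bool:
    """A file verifies only if its exact bytes were registered earlier.

    This says nothing about whether the registered recording was
    genuine -- garbage in, garbage out.
    """
    h = fingerprint(data)
    return any(entry["hash"] == h for entry in registry)

original = b"frame data from a security camera"
register(original)
print(verify(original))                # True
print(verify(b"tampered frame data"))  # False
```

Any single-bit change to the file produces a completely different SHA-256 hash, which is why storing only the fingerprint is enough to detect later tampering.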