• umami_wasabi
    18 · 10 months ago

    If this gets into smartphones, it becomes a mega cookie that tracks every single person, unless it can be deactivated.

    I won't be surprised if some SNS soon require this for uploading media, once adoption is wide enough.

  • @woelkchen@lemmy.world
    11 · 10 months ago

    Sony will look to roll this out in Spring 2024 with its next wave of premium cameras; however, it's not clear if there are plans to bring this to its flagship phones.

  • @GenderNeutralBro
    2 · edited · 10 months ago

    Has anyone played with the C2PA spec at all? I see there's an open-source tool for working with the spec: creating, verifying, and appending signatures. https://opensource.contentauthenticity.org/docs/c2patool/
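
    In case anyone else wants to poke at it, here's a minimal sketch of driving c2patool from Python to dump whatever manifest a file carries. I'm going from memory of the docs, so treat the bare "c2patool <file>" invocation and its JSON output as assumptions and double-check against c2patool --help:

```python
# Quick-and-dirty way to inspect a file's C2PA manifest by shelling out to
# c2patool. As far as I remember, running `c2patool <file>` with no flags
# prints the manifest store as JSON; a missing manifest (or any error) shows
# up on stderr with a non-zero exit code.
import json
import subprocess
import sys


def read_manifest(image_path: str) -> dict | None:
    """Return the parsed manifest store for image_path, or None if absent."""
    result = subprocess.run(
        ["c2patool", image_path],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(result.stderr.strip(), file=sys.stderr)
        return None
    return json.loads(result.stdout)


if __name__ == "__main__":
    manifest = read_manifest(sys.argv[1])
    if manifest is not None:
        print(json.dumps(manifest, indent=2))
```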

    I have not heard of any viewer applications that support the spec, though. Do any web browsers have it? All I found with a quick search was this unmaintained draft Chrome extension: https://github.com/serelay/c2pa-web

    I guess support will come as more outlets start using it.

    Anyway, after spending a little time reading up on the C2PA spec, I think I get it, and it’s not as dumb as I originally thought. I tend to have a viscerally negative reaction to any “standard” backed by Adobe, but this one doesn’t seem corrupt or anti-consumer.

    A few key points:

    1. It’s an open spec, so it can work with free software.

    2. It’s entirely optional.

    3. This feature is rolling out on the high end, because it is primarily useful to professionals.

    4. Trust in the photographer (as a person) remains crucial in legitimate journalism.

    If you want to embed a line of custody into your photos, that’s what this is for. If you don’t, you don’t need to. As the photo changes hands, everyone has the option of stripping that metadata, just like they can now with EXIF data.
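
    To be clear, stripping metadata is already trivial today. A rough Pillow sketch of dropping EXIF (and anything else embedded) by copying just the pixels into a fresh image; file names are placeholders:

```python
# Copy only the pixel data into a brand-new image, so EXIF and any other
# embedded metadata never make it into the output file. Plain Pillow.
from PIL import Image


def strip_metadata(src: str, dst: str) -> None:
    with Image.open(src) as original:
        clean = Image.new(original.mode, original.size)
        clean.putdata(list(original.getdata()))
        clean.save(dst)


strip_metadata("photo_with_exif.jpg", "photo_clean.jpg")
```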

    Personally I would probably not use this, because I’m not a professional and I don’t generally care to prove that any photos I take are authentic. I would prefer to err on the side of privacy myself, and NOT attach my name to every photo I take, the same way I currently disable GPS coordinates in EXIF tags in my phone’s camera app.

    I figure this will be most useful in professional settings such as journalism. If you are a professional photographer, you likely do want to prove that the images you publish were only edited by you. You want people to know whether a photo is authentically yours. If you are a media outlet, you want to continue that chain, both to confirm the photos you’re receiving are authentic, and to prove to your readers that any changes you make are authentic.

    AI gets the headline of course, but C2PA is more about proving authorship than it is about reality – whether that authorship is by camera, by AI, or by Photoshop or whatever. It’s cool that photos taken with Sony’s camera have a stamp saying it’s from a Sony camera, but ultimately it’s the photographer’s signature on it that matters more.

    I mean, photographers have always been able to lie with cameras, and that’s not going to change. You are trusting the photographer, not the photo. It’s worth as much as that person’s signature, no more.

    Ideally, that signature will be impossible to forge. Realistically, I would assume that at least some cameras will be hacked eventually. But it should be extremely difficult, as it is with e.g. the iPhone’s secure enclave. Trust in that signature should not be 100% but it should still be high, at least until a given camera model has a known exploit.
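
    To make the signature part concrete, here's a toy sketch of the kind of public-key signing this all rests on, using plain ECDSA from Python's cryptography library rather than the real C2PA manifest format (photo.jpg is a placeholder):

```python
# Toy version of the idea: the camera signs the image bytes with a private
# key, and anyone holding the public key can verify them. This is bare ECDSA
# over the raw file, NOT the actual C2PA manifest format.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# In a real camera the private key would live in tamper-resistant hardware.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

signature = private_key.sign(image_bytes, ec.ECDSA(hashes.SHA256()))

# Verification fails loudly if either the bytes or the signature were altered.
try:
    public_key.verify(signature, image_bytes, ec.ECDSA(hashes.SHA256()))
    print("signature checks out")
except InvalidSignature:
    print("image or signature was tampered with")
```

    Without the private key you can't produce a signature that verifies, which is why extracting the key from the camera is the interesting attack.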

    • ijeff (OP, mod)
      2 · 10 months ago

      It kept making too many errors!

  • @xia
    1 · 10 months ago

    Someone will make the weirdest AI image… displayed on the most photorealistic Sony monitor… with an “authenticating camera”. Proof! … for anything!

  • @pbbananaman@lemmy.world
    -4 · 10 months ago

    Just take a picture of your manipulated picture/video from the Sony phone. This does not guarantee anything of value.

    • @bitsplease@lemmy.ml
      7 · 10 months ago

      With your current phone, go ahead and take two pictures: one a normal shot of something, the other a picture of a picture of that thing.

      Now look at the two, and tell me you can’t tell in a split second that one is a picture of a picture. There’s a reason it’s a running joke on the internet that people need to learn to take real screenshots instead of taking a picture of the monitor: there are always annoying and obvious artifacts.

      • @pbbananaman@lemmy.world
        0 · 10 months ago

        You’re imagining a future where screen resolution doesn’t improve and lenses can’t solve these issues? Are people really this short-sighted?

        • @bitsplease@lemmy.ml
          1 · 10 months ago

          Except smartphone cameras will also improve. If anything, I’d say that over the last decade the average smartphone camera has improved at a much faster rate than the average computer monitor.

          Combine what I said with all the other metadata that will be collected, and I’m quite skeptical that you could fool an actual professional with your scheme.

      • @pbbananaman@lemmy.world
        1 · 10 months ago

        You could easily create a package that couples the authenticated device with a screen showing the faked images and bring that around. If there is a market for inauthentic images that appear authentic, people will easily bypass this technology.

        • @Ilovethebomb@lemm.ee
          3 · 10 months ago

          With the resolution of modern phone cameras, don’t you think you’d be able to literally see the pixels? Besides, phone cameras usually can’t focus that close.