• General_Effort@lemmy.world
    link
    fedilink
    English
    arrow-up
    13
    ·
    1 day ago

    The New Mexico court heard how Meta’s 2023 decision to encrypt Facebook Messenger – its direct messaging platform, which predators have used as a tool to groom minors and exchange child abuse imagery – blocked access to crucial evidence of these crimes.

    Encryption! These monsters!

    In the next phase of the legal proceedings, due to begin on 4 May, the attorney general’s office will seek additional financial penalties and court-mandated changes to Meta’s platforms that “offer stronger protections for children”, said Torrez.

    The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

    And when that happens, the headline lemmings here will call it enshittification and call for even harsher rules.

  • Phoenixz@lemmy.ca
    link
    fedilink
    English
    arrow-up
    35
    ·
    2 days ago

    jury finds firm misled consumers over safety and enabled harm against users

    If I do something like this, I go to jail

    WHY THE FUCK IS ZUCKERBERG NOT IN JAIL?

    • luciferofastora@feddit.org
      link
      fedilink
      English
      arrow-up
      14
      ·
      1 day ago

Because limited liability corporations were created to shield individuals from liability. His firm is liable, but no single individual within it is.

      Not even the ones making the executive decisions, despite their near-monarchic power. I guess since they’re appointed by a board of directors, it’s something like an electoral monarchy, except the board isn’t democratically elected so it’s a plutocracy by proxy. The ultimate culprit would be - and this is a chorus you’ve probably heard a thousand times on here - the shareholders, and going after them is hard. Particularly when the shareholders are themselves corporations…

      But the CEO is the pin focusing shareholder intent down into decisions and ultimately action. If they were effectively held responsible for their decisions, it would at least provide some counterbalance to the shareholders’ demands. It could also solve the “shareholders are corporations” issue, since you could make the CEOs of those companies liable for demanding illegal measures from companies they control.

      Of course, such a drastic change would be hard to actually push through, as things stand, since it would inhibit (illegal) profit and growth and “the economy” is a sacred cow. It’s still worth pushing for, in my opinion, but building awareness and support takes patience and tact to avoid pushing people into political apathy.

      The alternative I could see (and would prefer, but suspect to be even less attainable) is to dismantle the stock and capital system entirely. What you’d replace it with is a whole separate debate I won’t cover in this comment. Drastic systemic change is difficult to plan and enact, and building and maintaining the new system is difficult in the face of insecurities, old habits, unforeseen challenges that it may not yet have developed effective ways to deal with and generally all the growing pains that come with new things.

      They’re not mutually exclusive, and the first may be a step on the road to the second. Either way, public support is key, and that is rarely won quickly.

        • luciferofastora@feddit.org
          link
          fedilink
          English
          arrow-up
          6
          ·
          1 day ago

I get the meme, but it’s kinda dumb. This is a website where you’re free to just not read my comment if you don’t wanna engage with the topic, not a captive audience like a retail employee.

          • polderprutser@feddit.nl
            link
            fedilink
            English
            arrow-up
            2
            ·
            18 hours ago

            I was just highlighting the juxtaposition in length and depth between the two comments by dropping a dumb meme one level deeper. I get that might come across as not taking this seriously, and I do apologise for that. 🙇🏼‍♂️

            I genuinely value your post. It makes sense, and it fills me with dread precisely because I don’t see this changing quickly for the better. I do hope empathy and basic human decency prevail in the long run.

            Absurdist humour is one of my coping mechanisms for exactly these kinds of topics, not a way to dodge them. This particular attempt may have overshot that mark a bit though.

            • luciferofastora@feddit.org
              link
              fedilink
              English
              arrow-up
              2
              ·
              13 hours ago

              I was just highlighting the juxtaposition in length and depth between the two comments by dropping a dumb meme one level deeper.

              I know, I get the meme. I just took it as inspiration for another wordy, serious comment, which I now realise continued the trend. I suppose the apt follow-up would have been some even shorter quip like “OK Boomer”. Instead, you had to make a serious reply of your own and break the chain. Thanks, Obama.

              I genuinely value your post.

              And I value your genuine response and explanation. We hope together.

              Absurdist humour is one of my coping mechanisms for exactly these kinds of topics

              That I can get behind. When confronted with the absurdity of our great ambitions and worries in face of our own insignificance, what else can we do but make memes?

              What better way to bear dark times than to make light of them?

              When life is serious enough, you don’t need to be.

              Live. Laugh. Shitpost.

    • BanMe@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      2 days ago

      You can’t put a shareholder in jail, they’re the entire point of the system gestures broadly

  • bitjunkie@lemmy.world
    link
    fedilink
    English
    arrow-up
    80
    arrow-down
    1
    ·
    2 days ago

    Putting this in fixed-width for scale:

    This ruling:                        375,000,000
    Meta valuation:               1,618,000,000,000
    

    This isn’t even a slap on the wrist; it’s a fucking rounding error.

    • 7101334@lemmy.world
      link
      fedilink
      English
      arrow-up
      54
      ·
      2 days ago

Phrased in another way, it’s equivalent to if you had $1,618 in the bank and were fined about $0.38.
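The scaling works out like this (a back-of-the-envelope sketch using the two figures from the comment above; the $1,618 balance is just a hypothetical stand-in):

```python
# Back-of-the-envelope: scale Meta's fine down to personal-bank-account size.
fine = 375_000_000               # this ruling, in dollars
valuation = 1_618_000_000_000    # Meta's market valuation, in dollars

ratio = fine / valuation                  # ~0.023% of the company's value
personal_balance = 1_618                  # hypothetical bank balance, in dollars
personal_fine = personal_balance * ratio  # ~0.375, i.e. about 38 cents

print(f"{ratio:.4%}  ${personal_fine:.2f}")
```

Same ratio either way: the fine is roughly two hundredths of one percent of what the company is worth.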

    • rumba@lemmy.zip
      link
      fedilink
      English
      arrow-up
      15
      ·
      2 days ago

      Super small compared to their income, but a GREAT reason to make all the users age validate.

      • badgermurphy@lemmy.world
        link
        fedilink
        English
        arrow-up
        22
        ·
        edit-2
        2 days ago

        Fining companies that commit a crime a small portion of the money they gained by committing that crime is not progress, that is the problem here. Meta still made more money, after the fine, than if they had not perpetrated the crime. This is more of the status quo, which is why people are complaining about this the same as they had about the previous million times this same thing happened.

      • Seth Taylor@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        2 days ago

        Nah, this is a practice in America. John Oliver did an episode on it but I can’t remember for the life of me what the main topic was.

  • melsaskca@lemmy.ca
    link
    fedilink
    English
    arrow-up
    47
    ·
    2 days ago

Good! Remember though, fines don’t count anymore, only hard time. Remove some years from these fuckers’ lives and they’ll think twice in the future.

    • Tenderizer@aussie.zone
      link
      fedilink
      English
      arrow-up
      3
      ·
      19 hours ago

      This lawsuit is about end-to-end encryption and the lack of age verification on Instagram. So not good.

    • Goodlucksil@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      19
      ·
      2 days ago

      Do I have to remind everyone the ending of The Wolf of Wall Street?

      Tap for spoiler

Rich people go to rich people prisons that aren’t really prisons and are better than your house.

      • BarneyPiccolo@lemmy.today
        link
        fedilink
        English
        arrow-up
        10
        ·
        edit-2
        2 days ago

        I don’t really care which prison they go to, as long as they also get leased out to do dirty, dangerous, back-breaking manual labor like every other Federal 13th Amendment Labor Slave. Grab that shovel, Inmate 4547.

          • BarneyPiccolo@lemmy.today
            link
            fedilink
            English
            arrow-up
            3
            ·
            2 days ago

I get it. I remember reading that Ghislaine Maxwell was “much much happier” with her new accommodations, saying that the food was “legions better” and staff was “responsive and polite.” Isn’t that nice for her?

            But the business model of American prisons includes 13th Amendment Slavery, so she might not be as happy with that twist.

      • UnderpantsWeevil@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        2 days ago

        I don’t begrudge rich people going to rich people prison, because the point of prison is to remove dangerous people from society not to torture them in a cage. I do begrudge poor people going to poor people prison, because it seems as though these prisons exist as a means of extracting cheap labor from poor and PoC populations. Or outright abusing them - mentally, physically, and sexually - because this kind of brutality generates political rewards.

    • BarneyPiccolo@lemmy.today
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      2
      ·
      2 days ago

      I don’t know, I’d sacrifice a few years if I knew I’d be released to my Trillion dollar fortune. I’d make that deal.

      They love their money as much as their freedom, each is worthless without the other, so take both.

    • UnderpantsWeevil@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 days ago

      fines don’t count anymore, only hard time

      I mean, you’re assuming this survives one of the eight million appeals the Facebook legal team is going to throw at it.

      But yes, by the time it works itself all the way up and down the appellate courts, I wouldn’t expect this $1.5T company to experience any legal penalties in excess of a few million dollars.

    • MyMindIsLikeAnOcean@piefed.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      18 hours ago

      If social media companies were required to moderate their content…if they were responsible for what’s posted…all problems would go away.

      As it stands bad actors use bots to stay one step ahead of automated moderation.

      • Boiglenoight@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        2 days ago

        Yes. It’s probably concerning if they are continuously fined, but unless there’s a mechanism that ensures that, this is likely just annoying and not meaningful.

  • Fredselfish@lemmy.world
    link
    fedilink
    English
    arrow-up
    181
    arrow-down
    1
    ·
    3 days ago

    So…it’s a fucking fine, which way less then he made by doing this. Until throw these fucks in jail this shit will continue.

    • staircase@programming.devOP
      link
      fedilink
      English
      arrow-up
      59
      arrow-down
      1
      ·
      edit-2
      3 days ago

      In the next phase of the legal proceedings, due to begin on 4 May, the attorney general’s office will seek additional financial penalties and court-mandated changes to Meta’s platforms that “offer stronger protections for children”, said Torrez.

      The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

      Unclear how age verification would play out with their Digital Childhood Alliance efforts.

            • paraphrand@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              edit-2
              2 days ago

              The victims of child exploitation? Or the lawyers representing them? Or…? I’m not asking about vague general “save the children” stuff. I’m talking about this lawsuit and similar.

              • obre@slrpnk.net
                link
                fedilink
                English
                arrow-up
                6
                ·
                edit-2
                2 days ago

Not the person you were responding to, but IMO it’s the defense attorneys / legal department working to ensure that the legal outcome is as beneficial to the corporation as possible, even if they “lose”. In this case the fine is a cost of doing business, not nearly enough to actually discourage malfeasance, and the legal/PR pivot to blaming encryption rather than their algorithms is something they hope will tee them up to do even more massive surveillance in the near future.

    • Lost_My_Mind@lemmy.world
      link
      fedilink
      English
      arrow-up
      30
      ·
      3 days ago

      Until throw these fucks in jail this shit will continue.

      Which is exactly why that won’t happen. Our president is a pedophile. There’s a whole network of wealthy pedophiles who no longer have an island. The pedophiles are in power.

      • OwOarchist@pawb.social
        link
        fedilink
        English
        arrow-up
        19
        ·
        3 days ago

        who no longer have an island

        *who now have a different island that we don’t yet know about.

      • waddle_dee@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 days ago

Or it could be because it’s a civil case with no penal repercussions, because it’s a bloody civil case. For them to go to jail, the DOJ would have to file criminal charges against Meta. They won’t do that, not because of Pedo President, but because the DOJ has been too chicken shit since Enron to go after anyone else.

        • FudgyMcTubbs@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          edit-2
          2 days ago

          Y’all are gonna end up with a Pizza Gate situation. We need real leaders who will hire an effective DOJ to investigate and charge the monsters in a timely but just manner. We need a new viable party.

    • M137@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      2 days ago

      Do you leave out words on purpose? It’s one thing to misspell something but I just don’t understand how you managed to just not write one word in both of your sentences.
      And just for clarity, those words are “is” and “we”.

      • trackball_fetish@lemmy.wtf
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 days ago

Strangely, I find myself doing this occasionally as well (although not as bad). I sometimes wonder if it’s cognitive damage from covid or perhaps the long-term result of spell check / predictive text technologies. Either way it’s somewhat concerning.

        • Fredselfish@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          2 days ago

It’s a fucking combination of predictive text making changes and Lemmy removing words after I post. Not sure if it’s the app I use or Lemmy in general. I keep forgetting to recheck the text after I post.

    • matlag@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      2
      ·
      2 days ago

      Until throw these fucks in jail this shit will continue.

You think? Send Zuckerberg or any of these billionaires to jail:

1. They will use their army of lawyers to get moved to a for-profit prison.
2. They will buy that prison.
3. They will make changes inside, turning it into a resort for them, but an absolute shithole for all the other prisoners, guards, etc.

      • moustachio@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 days ago

        Or they won’t be able to do any of that with their assets seized and they’ll serve their time. No reason to make up some nonsense defeatist scenario.

  • BarneyPiccolo@lemmy.today
    link
    fedilink
    English
    arrow-up
    24
    ·
    edit-2
    2 days ago

    Fine Zuckerfuck his entire net worth AND Meta. He’s poor now.

    Now, let’s take a look at Musk, Bezos, and Ellison.

      • BarneyPiccolo@lemmy.today
        link
        fedilink
        English
        arrow-up
        4
        ·
        2 days ago

        Which is too bad, because he is a legendary asshole who deserves as much disparagement and derision as possible. He’s an unrepentant MAGA Traitor and he’s captured about half of the media, including major news sources, which he is reconfiguring to be part of the Conservative Propaganda Machine.

It all needs to be torn from his grasp, his companies broken up, and his business dealings deeply investigated. I have no doubt he’ll be in prison by the time it’s over, which is where he and his equally Sociopathic son belong.

  • Puddinghelmet@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    ·
    edit-2
    2 days ago

It says Google is already fighting the lawsuit and Zuckerberg wants to as well, lmao. And he says he wants to protect children, but he won’t even admit fault to the victims? Asshole. There’s literally a docu about it: Molly vs the machines.

The two companies will probably have to pay more than the 375 million dollars. In the next phase of the trial, the jury will examine so-called punitive damages: additional damages intended as an extra penalty.

    And because of this instagram will also remove end-to-end encryption and add age-verification

    The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

    Regarding the encryption decision, a Meta spokesperson told CNN that, “very few people were opting in to end-to-end encrypted messaging in DMs, so we’re removing this option from Instagram in the coming months. Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp.”

    https://edition.cnn.com/2026/03/24/tech/meta-new-mexico-trial-jury-deliberation

    In May, Judge Bryan Biedscheid is slated to hold a trial without a jury on the state’s claims that Meta created a public nuisance that harmed state residents’ health and safety. The state will ask Biedscheid to direct Meta to make changes to its platforms, including adding effective age verification and removing predators, it said Tuesday.

    https://www.msn.com/en-us/crime/general/meta-ordered-to-pay-375-million-in-new-mexico-trial-over-child-exploitation-user-safety-claims/ar-AA1ZkHhq

    • bitjunkie@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      2 days ago

      If you’re still using Meta spyware in 2026 and think you’re getting true E2E without a backdoor, I’ve got a bridge to sell you.

      • Puddinghelmet@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        5
        ·
        edit-2
        2 days ago

How do they get the key? Isn’t that stored on my and my chat partner’s literal phones? You can only get it by physically unlocking them? Show me technical proof. Meta says they only collect metadata, but the actual data is encrypted… ofc that guy lies, but then we can drag him in front of a judge. And you’re right, ruzzia also hacked Meta recently via their linked devices or support bots… U got proof or just a hunch?

        • locahosr443@lemmy.world
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          1
          ·
          1 day ago

          ‘Show me proof meta is a bad actor or I’ll just take their word they aren’t’

          I guess that’s an opinion to have…

        • borari@lemmy.dbzer0.com
          cake
          link
          fedilink
          English
          arrow-up
          13
          ·
          2 days ago

          Did you run gpg yourself to generate the key pair, then exchange pub keys with your chat partner? Or did Facebook generate the keys for you from within a closed source application?

          • Puddinghelmet@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            4
            ·
            edit-2
            2 days ago

If it has a backdoor it’s literally not end-to-end encryption, and they say it is, so… idk, they are literally breaking the law and we can fine them again?

            • borari@lemmy.dbzer0.com
              cake
              link
              fedilink
              English
              arrow-up
              12
              ·
              edit-2
              2 days ago

You’re misunderstanding what end-to-end encryption is. If they have a copy of your private key, it’s still end-to-end encrypted. The alternative would be akin to a TLS termination proxy, where your device would encrypt a message using Facebook’s public key; they decrypt the message, store it, and then Facebook uses your chat partner’s public key to encrypt it and send it to them. You cannot send an encrypted message straight through to your chat partner.

What I’m insinuating is that there’s no way to know if Facebook has a copy of your private key. The message is still end-to-end encrypted: it is encrypted by you using your chat partner’s public key, and it passes through all of Facebook’s infrastructure encrypted, until your chat partner receives and decrypts it. If Facebook stores the message, it’s stored encrypted. They can just decrypt it when subpoenaed, or whenever they want, because they have the required private key.
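To make the “who holds the key” point concrete, here’s a toy Diffie-Hellman key agreement in Python. The numbers are deliberately tiny and insecure, and this is purely illustrative; real Messenger/WhatsApp E2EE uses the Signal protocol, which is far more involved:

```python
# Toy Diffie-Hellman key agreement with deliberately tiny, insecure numbers.
p, g = 23, 5                       # public modulus and generator

alice_priv, bob_priv = 6, 15       # secrets each side keeps (in theory) to itself
alice_pub = pow(g, alice_priv, p)  # public keys, safe to send over the wire
bob_pub = pow(g, bob_priv, p)

# Each side combines its own private key with the other's public key
# and derives the same shared secret used to encrypt messages:
alice_secret = pow(bob_pub, alice_priv, p)
bob_secret = pow(alice_pub, bob_priv, p)
assert alice_secret == bob_secret

# The catch: a server that quietly escrowed a copy of alice_priv derives
# the identical secret, even though every message it relayed stayed
# "end-to-end encrypted" on the wire.
server_secret = pow(bob_pub, alice_priv, p)
assert server_secret == alice_secret
```

Nothing in the ciphertext distinguishes the two cases; the security difference lives entirely in who has a copy of the private key, which is exactly why a closed-source client asks you to take that on trust.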

              • Puddinghelmet@lemmy.world
                link
                fedilink
                English
                arrow-up
                4
                ·
                edit-2
                2 days ago

                Ooo mb you’re right yeah, also when you use backups I read… ok something to look into for myself to understand better fr, thanks for this comment btw

    • 7101334@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      edit-2
      2 days ago

      instagram will […] add age-verification

      Judge Bryan Biedscheid is slated to hold a trial without a jury on the state’s claims that Meta created a public nuisance that harmed state residents’ health and safety. The state will ask Biedscheid to direct Meta

      Listen, I cannot wait for the day that everyone stops using Meta products and Mark Zuckerberg is turned into longpork wagyu in his stolen-land Hawaiian bunker, but the latter statement does not seem to support the initial claim.

      I wouldn’t hold my breath for any changes which will meaningfully impact the profitability of Meta.

      • Tollana1234567@lemmy.today
        link
        fedilink
        English
        arrow-up
        2
        ·
        1 day ago

meta is too useful for russian propaganda being peddled on facebook for conservatives/gop. they aren’t going anywhere.

  • deathbird@mander.xyz
    link
    fedilink
    English
    arrow-up
    87
    arrow-down
    1
    ·
    3 days ago

The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

    Oh fuck right off.

    I’m sorry but this is a bad “think of the children” decision. There are limits to what Meta or any platform can do about bad actors at that size without structural changes.

    What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

    And improve parental controls for children’s accounts. I’m sure there’s nothing currently giving a “parent” account high level control over a “child” account, but I’m happy to be corrected if I’m wrong.

    But also: require intercompatibility with other platforms and a standardized form of profile data export so people can leave Facebook but stay in touch with the people who still use it.

    • MyMindIsLikeAnOcean@piefed.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      18 hours ago

      Dude…installing Facebook Purity doesn’t protect you from child predators, what are you even talking about?

Want to know how all social media could simply and easily protect minors, and everybody at large? Hire some fucking moderators. Every social media company should be required to use as many humans as it takes to moderate all content posted on their platforms… every problem would be reduced to near zero. What’s happening now is that nobody works at META… except at the design, legal and coding level. If you’re a bad actor and you want to post… you use a bot to interact with an automated process, and you’re always one step ahead of the automated process.

    • lmmarsano@group.lt
      link
      fedilink
      English
      arrow-up
      20
      arrow-down
      1
      ·
      edit-2
      2 days ago

      And improve parental controls for children’s accounts. I’m sure there’s nothing currently giving a “parent” account high level control over a “child” account, but I’m happy to be corrected if I’m wrong.

      Parental controls already exist in every major OS, they suffice to restrict & monitor social media, and they go unused.

      A better solution might be for laws to provide parents resources & incentives to parent children’s online activity (including training to use resources they already have) & to provide children education in online safety & literacy. Decades ago, federal courts citing commission findings & studies recommended these alternatives as superior in effectiveness, meeting government duties to minimize impact on civil liberties, allocation of law enforcement resources, etc. For the permanent injunction to COPA, the judge wrote

      Moreover, defendant contends that: (1) filters currently exist and, thus, cannot be considered a less restrictive alternative to COPA; and that (2) the private use of filters cannot be deemed a less restrictive alternative to COPA because it is not an alternative which the government can implement. These contentions have been squarely rejected by the Supreme Court in ruling upon the efficacy of the 1999 preliminary injunction by this court. The Supreme Court wrote:

      Congress undoubtedly may act to encourage the use of filters. We have held that Congress can give strong incentives to schools and libraries to use them. It could also take steps to promote their development by industry, and their use by parents. It is incorrect, for that reason, to say that filters are part of the current regulatory status quo. The need for parental cooperation does not automatically disqualify a proposed less restrictive alternative. In enacting COPA, Congress said its goal was to prevent the “widespread availability of the Internet” from providing “opportunities for minors to access materials through the World Wide Web in a manner that can frustrate parental supervision or control.” COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.

      I also agree and conclude that in conjunction with the private use of filters, the government may promote and support their use by, for example, providing further education and training programs to parents and caregivers, giving incentives or mandates to ISP’s to provide filters to their subscribers, directing the developers of computer operating systems to provide filters and parental controls as a part of their products (Microsoft’s new operating system, Vista, now provides such features, see Finding of Fact 91), subsidizing the purchase of filters for those who cannot afford them, and by performing further studies and recommendations regarding filters.

      Adult supervision, child education on online safety & literacy, parental controls & filters are more effective at less expense to fundamental rights. Governments know this & conveniently forget it.

      • deathbird@mander.xyz
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 days ago

        OS level parental controls do not give a parent control over a child’s use of a social media platform, to the best of my knowledge. For example, how do you prevent a child from friending someone you don’t know on facebook, while still letting your child join a Facebook group for their soccer team? That kind of fine grain control needs to happen on the level of the platform. Universally blocking DMs to your child’s account from accounts they are not friended to needs to happen on the level of the platform. Etc.

        Universally preventing children from joining social media is also an option, or giving parents the tools to block their children individually from accessing known social media sites from hardware under the parent’s control is also an option, but neither of these are sufficient or without negative consequences. Blocking children from social media by law requires age verification to have any effect. Blocking access to certain websites on a hardware level encourages the child to use hardware outside their parents control, or else excludes them from a part of social life.

        Platforms need parental control tools as well, not just operating systems, and those tools need to be sufficient to allow a parent to have real control over what their child can access. I don’t think that will exist without legislation, because it is contrary to the platform’s financial interest.

        • lmmarsano@group.lt
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          edit-2
          2 days ago

          OS level parental controls do not give a parent control over a child’s use of a social media platform

          A quick web search indicates they can filter/block content, restrict apps, report activity. Additional software can monitor communication (including social media) and alert guardians.

          However, the legal opinion wasn’t that parental control software is the best solution or only better solution[1], but that more effective alternatives (such as non-punitive laws promoting use of client-side parental controls) with less adverse impact exist than punitive laws limited in their enforceability by jurisdiction & that unnecessarily burden & deter (thus harm) free exercise fundamental liberties.[2] Client-side parental controls only affect their users without affecting everyone else. Unlike regulations on site operators, they work on content originating outside a law’s jurisdiction. Even at the time of that federal court decision, parental controls could screen dynamic content (eg, live chats) over any protocol.

          By far, the most appropriate answer is responsible adult involvement & supervision and the education of children to address motivation, coping, & responsible behavior.

          The internet is global. A key problem with any coercive law is that its jurisdiction isn’t: just as 4chan.org can tell UK’s OfCom to go fuck itself, site operators beyond a law’s jurisdiction can tell its enforcers the same. Another issue is that the compliance burden falls harder on entrants than on the dominant companies in the industry, which have more resources to afford to comply, thus deterring competition. Do we really want to make it harder to displace our current social media companies with alternatives?

          Communication alone rarely poses immediate danger: there’s usually a number of steps between the communication & actual harm where anyone can intervene. We can block or ignore unwanted communication & choose the information we disclose. Responsible people can guide their children on safety & control their access to the devices they give them.

          A while ago, when my uncle struck his kid for making an unauthorized payment through the kid’s tablet, I scolded him for creating the situation where the kid could do that instead of setting up a child account with parental controls. When I asked him how child abuse is more responsible than reading some shit designed for him to understand and pressing a few buttons to use the system exactly as designed to prevent this shit from happening, he quickly got the point and did that in about an hour. This shit ain’t hard.

          Better solutions already exist, they’re effective, and the solid recommendations governments already have to promote them effectively would work. Governments have largely chosen not to.


          1. The cited recommendations I mentioned elsewhere went beyond parental control software into areas such as the promotion of standards & the development of better standards in the industry. ↩︎

          2. Rather than accept any law, government has a duty to minimize compromises of fundamental rights in meeting its “compelling interests”. When government fails to prove that a law is the least adverse to fundamental liberties among alternatives that are at least as effective, that law must be rejected. ↩︎

          • deathbird@mander.xyz
            link
            fedilink
            English
            arrow-up
            1
            ·
            20 hours ago

            I think you’re still misunderstanding me, and the scope of existing solutions.

            It is not sufficient to have tools that merely monitor behavior, and blocking whole websites is too crude. If those are the only tools you have, you’re encouraged either not to use them or to overuse them; there’s no real in-between.

            These platforms make deliberate design choices. There are some things they decide not to do because they’re less profitable.

            Again, what if I want my kid to have a Facebook account so that he can coordinate with his soccer club independently, but I don’t want him to DM strangers or join strange groups? I want to facilitate independence, not have to look over his shoulder constantly, but still protect him from groomers. It is not hard for platforms to enable this kind of control. It’s just not profitable.

    • Dr. Moose@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      arrow-down
      2
      ·
      2 days ago

      What actually might help: hold people who design these tools criminally liable. Everyone knows what they are doing but you can’t really say no to your employer because “don’t worry you’re not liable” so everyone continues on building the Torment Nexus.

      • deliriousdreams@fedia.io
        link
        fedilink
        arrow-up
        5
        arrow-down
        1
        ·
        2 days ago

        Are you suggesting that we should be able to criminally prosecute people who build end-to-end encryption software and tools? Or algorithms that find people you may know? Because those seem to be key to the Meta lawsuit as far as they are involved. That, and the fact that Meta deliberately misled the public about the safety of the website for kids. Because social media as it exists today isn’t really safe for children, and at best the people who should be held accountable for that are the executives who made the decision to lie.

        But your average programmer isn’t designing tools for the purpose of making kids less safe. They aren’t designing tools for the purpose of being addictive. And they aren’t designing tools for predators. They happen to have designed tools used by predators because of the flaws in the design and the fact that their executives found those flaws to be advantageous to their bottom line so they played them up. Leaned in if you will.

        It was literally part of the 2021 leak that they had discovered their algorithm had certain effects, and the C-suite literally went about making sure they could use that for monetary gain to keep people on the site and scrolling. Not just young users, but users of all ages.

        The main thing is that it’s really easy to social engineer on a social media website where people are encouraged to give out all kinds of information that can be used against them in social engineering attacks. That, combined with the addiction fostered there and the encrypted chat methods owned by Meta and used by quite a bit of the world en masse is what created this situation.

        • Dr. Moose@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          edit-2
          2 days ago

          There’s a difference between making an encryption tool and hiring top psychologists to design abusive systems.

          • deathbird@mander.xyz
            link
            fedilink
            English
            arrow-up
            3
            ·
            2 days ago

            The “these tools” that the government is targeting include end-to-end encryption, and I am of course of the position that end-to-end encryption is a good thing.

            When we talk about “abusive systems” we need to be very clear about what kinds of technology or system behaviors we are discussing, or else the government’s default solution will usually be “well, it can’t be only the platform that spies on you.”

          • deliriousdreams@fedia.io
            link
            fedilink
            arrow-up
            2
            ·
            2 days ago

            Have you read the whistleblower’s book? Or even just the excerpts from it that have been floating around for ages?

            I’m curious, because it’s clear to me that the C-suite at Meta and companies like it absolutely do employ some really shitty people, but at the same time, that doesn’t mean you can paint the janitor with the same brush as the Lean In woman who made her personal assistant buy lingerie and model it in her home for her. Or tried to force another woman to cuddle with her while she was pregnant.

            So what I’m saying is, I don’t agree with the sentiment that everyone who works there is a power mad executive intent on algorithmic domination of the internet, and for at least some of the programmers in question a job is a job.

            I will say that it’s different if they know what’s going on and have a real ability to make the decision to fight against such a thing.

            But I question where your line of complicity starts and ends here.

            I guess I’m also pointing out that part of what makes Meta properties particularly attractive to pedophiles is the same thing that makes them attractive to other online criminals: the encryption.

    • RecallMadness@lemmy.nz
      link
      fedilink
      English
      arrow-up
      2
      ·
      2 days ago

      Unfortunately, you can’t codify how platforms work specifically into law.

      But you could possibly make companies explicitly liable for promoting “detrimental” content. Then define “promoting” as something like “surfacing content to a user beyond the reach of the user’s immediate network”, i.e. algorithmic suggestions or advertising.
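      To make the proposed definition concrete, here is a minimal sketch of that liability test as a predicate. Everything here is hypothetical illustration: the function name, the network-as-set model, and the `surfaced_by_platform` flag are all assumptions, not drawn from any real statute or platform API.

      ```python
      # Hypothetical sketch of the proposed "promoting" test: content counts as
      # promoted if the platform itself surfaced it AND its author is outside
      # the viewer's immediate network (people they follow or are friends with).
      def is_promoted(viewer_network: set[str], author: str, surfaced_by_platform: bool) -> bool:
          """Return True if showing this content would count as 'promoting' it."""
          return surfaced_by_platform and author not in viewer_network

      # A post from someone the user follows, shown chronologically: not promoting.
      print(is_promoted({"alice", "bob"}, "alice", surfaced_by_platform=False))  # False
      # An algorithmic suggestion from a stranger: promoting, so liability would attach.
      print(is_promoted({"alice", "bob"}, "mallory", surfaced_by_platform=True))  # True
      ```

      The point of drawing the line this way is that ordinary hosting and search stay legal; only the platform’s own act of pushing outside content at a user triggers liability.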

      • deathbird@mander.xyz
        link
        fedilink
        English
        arrow-up
        1
        ·
        2 days ago

        You can control a lot about how platforms work with legislation. After decades of seeing how they function, we are more than capable of accurately identifying which functionalities are deleterious to the well-being of people generally or children specifically, and which should be under the control of children’s parents. Therefore we have a good idea of what we should legally require platforms to disable or otherwise put under parental control.

        Again I propose as an example: having an account marked as a child’s account, with a designated parent account, and making it so that if someone attempts to add that child account as a friend or connected account on a social media network, the addition must be approved by the parent account.
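        The proposed rule is simple enough to sketch in a few lines. This is not any platform’s real API; the class, field names, and request-id scheme are all invented for illustration of the approval flow.

        ```python
        # Illustrative sketch: a connection request to a child account is held
        # in a pending queue until the designated parent approves or rejects it.
        from dataclasses import dataclass, field

        @dataclass
        class ChildAccount:
            username: str
            parent: str                                            # designated parent account
            pending: dict[str, str] = field(default_factory=dict)  # request id -> requester
            connections: set[str] = field(default_factory=set)

            def receive_request(self, requester: str) -> str:
                """Queue the request for parental review instead of connecting directly."""
                request_id = f"req-{len(self.pending) + 1}"
                self.pending[request_id] = requester
                return request_id

            def parent_decision(self, parent: str, request_id: str, approve: bool) -> None:
                """Only the designated parent may resolve a pending request."""
                if parent != self.parent:
                    raise PermissionError("only the designated parent may decide")
                requester = self.pending.pop(request_id)
                if approve:
                    self.connections.add(requester)

        child = ChildAccount(username="kid", parent="mom")
        rid = child.receive_request("coach")
        child.parent_decision("mom", rid, approve=True)
        print(child.connections)  # {'coach'}
        ```

        Nothing about this is technically hard; the whole mechanism is a pending queue plus a permission check on who may clear it.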

    • ExLisper@lemmy.curiana.net
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      2
      ·
      2 days ago

      What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

      You would simply have big groups like “I ❤️ New Mexico” where people comment on the same posts and interact. If you limited all content, including comments and likes, to users someone personally follows, without the ability to discover other users, you would basically turn Facebook into WhatsApp. It would definitely solve the issue, but it would also make the platform look empty and kill it. Which would not necessarily be bad, but sadly killing Facebook is too radical for anyone to support.
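      The feed model being proposed here can be sketched in a few lines. This is a toy illustration, not how any real platform is implemented; the `Post` type and its fields are assumptions.

      ```python
      # Minimal sketch of the proposed feed: only posts from accounts the user
      # follows, newest first, with no algorithmic suggestions mixed in.
      from dataclasses import dataclass

      @dataclass
      class Post:
          author: str
          timestamp: int   # e.g. Unix time
          text: str

      def build_feed(posts: list[Post], following: set[str]) -> list[Post]:
          """Followed authors only, reverse-chronological, nothing surfaced by the platform."""
          return sorted(
              (p for p in posts if p.author in following),
              key=lambda p: p.timestamp,
              reverse=True,
          )

      posts = [
          Post("alice", 100, "hi"),
          Post("stranger", 300, "suggested group!"),
          Post("bob", 200, "soccer practice at 5"),
      ]
      feed = build_feed(posts, following={"alice", "bob"})
      print([p.author for p in feed])  # ['bob', 'alice']
      ```

      Note that the stranger’s post is simply absent; discovery would have to happen through deliberate search or shared links rather than injection into the feed.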

      • deathbird@mander.xyz
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 days ago

        That’s literally how Facebook used to function, and a big motivator as to why people joined it. People wanted to interact with people they knew.

        And this would not prevent people from making new connections.

        If you wanted to meet, I don’t know, hot singles in your area, you would actually have to talk to people who knew about the singles group on Facebook and have them share the link to the group with you. Or find it in the search bar if it’s public. You know, seek it out.

        Keep in mind that Facebook does not show you groups or people because it cares about the connections that you make. It just wants you to keep clicking. Your own desire to connect is more than sufficient to drive you to connection…with a search bar.

        • ExLisper@lemmy.curiana.net
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          2
          ·
          2 days ago

          That was then. Now is not 20 years ago. You can’t just take a platform like Facebook 20 years into the past and expect the business model to still function. It’s like saying that people used to buy newspapers so banning NYT from having a website is not a problem, they will just sell newspapers again.

          • deathbird@mander.xyz
            link
            fedilink
            English
            arrow-up
            2
            ·
            20 hours ago

            If Facebook as it is today is the only profitable way for it to function, it shouldn’t exist.

            That’s not the case, of course. A disenshittified Facebook is possible, and it would be profitable too; it just wouldn’t make literally all the money, and it wouldn’t exploit you in literally every single way it could manage. Zuckerberg and the other shareholders would have to tolerate lower profits, but they wouldn’t have zero.

  • ExLisper@lemmy.curiana.net
    link
    fedilink
    English
    arrow-up
    63
    ·
    2 days ago

    The jury ordered Meta to pay the maximum penalty under the law of $5,000 per violation, totaling $375m in civil penalties for violating New Mexico’s consumer protection laws.

    Meta: I guess I will only be able to spend $79,635,000,000 on my next useless venture.

    • Tenderizer@aussie.zone
      link
      fedilink
      English
      arrow-up
      2
      ·
      19 hours ago

      This lawsuit is about end-to-end encryption and the lack of age verification on Instagram. So not good.

  • XLE@piefed.social
    link
    fedilink
    English
    arrow-up
    23
    ·
    2 days ago

    Unfortunately, part of the court’s decision was that Facebook wasn’t surveilling people enough.

    The New Mexico court heard how Meta’s 2023 decision to encrypt Facebook Messenger – its direct messaging platform, which predators have used as a tool to groom minors and exchange child abuse imagery – blocked access to crucial evidence of these crimes.

    • kcuf@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      1 day ago

      Yes. My take is that Meta and others want this lawsuit to happen this way because they can use it as an excuse for age verification and the other tracking things going on at the moment too. The fine is nothing to them, but this is justification to require more user identification.

      • XLE@piefed.social
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 day ago

        I was recently encouraged a little by a lawsuit Meta lost that was based on the fact they were knowingly collecting too much data on a minor. The obvious solution is they should be more responsible with what they have (and probably start removing it), but their ideal solution is probably more data collection + focusing their abuse on vulnerable people who aren’t legally protected from it.