• DominusOfMegadeus
    -38 · 6 months ago

    You’re right. I had an idea to regulate without completely eliminating it, but that’s obviously crazy talk.

    • @GoodEye8@lemm.ee
      26 · 6 months ago

      You do know the R in GDPR literally stands for Regulation? There’s already a regulation that ChatGPT should follow but deliberately doesn’t. Your idea isn’t to regulate, it’s to get rid of the regulation so you can keep using your tool.

      • @laurelraven@lemmy.blahaj.zone
        2 · 6 months ago

        To me it sounded more like enforcing the regulations without destroying the company or product, which I would have assumed was the preferred avenue with most regulations.

        • @GoodEye8@lemm.ee
          5 · 6 months ago

          Agree to disagree. Regulations exist for a purpose and companies need to follow them. If a company/product can’t exist without breaking regulations, it shouldn’t exist in the first place. When you take the stance that a company/product needs to exist, and a regulation prevents that, and you go changing the regulation, you’re effectively getting rid of the regulation. Now, there may be exceptions, but this here is not one of those exceptions.

          • @laurelraven@lemmy.blahaj.zone
            1 · 6 months ago

            I mean, sure, if that’s what someone is saying, but I didn’t see anyone suggest that here.

            Companies violating regulations can be made to follow them without tearing down the company or product, and I’m absolutely not convinced LLMs have to violate the GDPR to exist.

            • @GoodEye8@lemm.ee
              1 · 6 months ago

              That’s a matter of perspective. I took the other person’s comments as “Don’t take away my ChatGPT, change the regulations if you must but don’t take it away”, which is essentially the same as “get rid of the regulation”.

              Realistically I also don’t see this killing LLMs, since the infringement concerns the accuracy of the information they give about people. I’m assuming they have enough control over their model to make it say “I can’t give information about people”, and then everything is fine. But if they can’t (or, more likely, won’t, because it would cost too much money), then the product should get torn down. I don’t think we should give a free pass to companies for playing stupid games, even if they make a useful product.