No, really, those are the magic words

A clever AI bug hunter found a way to trick ChatGPT into disclosing Windows product keys, including at least one owned by Wells Fargo bank, by inviting the AI model to play a guessing game.…
Hacking in the future is going to be so stupid
Before I read this and realized “I give up” was the trigger for getting the key, I read the headline like, “AI? I can’t even…”
Why is the model trained on real Windows keys in the first place?