I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it couldn't do that because it's unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

  • I recently asked Bing for some code on a pretty undocumented feature and use case. It was typing out a clear answer from a user forum, but just before it was done, it deleted everything and said it couldn't find anything. I tried again in a new conversation, and it didn't even try to type it out, just said the same thing straight away. Only when I gave it a hint in the question, based on what it had previously typed, did it actually give the answer. ChatGPT didn't have this problem and just gave an answer, even though it was a bit outdated.

    • @momentary@lemmy.ml
      1 year ago

      I see this quite a bit on ChatGPT. Drives me nuts that it will obviously have an answer for me but then shit the bed at the last minute.