Microsoft is already loosening the restrictions it recently placed on interactions with the Bing AI chatbot, and it says it will start testing another option that lets users choose the tone of the chat: Precise (shorter, more focused answers), Creative (longer and chattier), or Balanced (a bit of both).
After repeated reports of strange behavior (like the time it split into multiple personalities and one of them offered us furry porn) and jailbreaks that did things like expose its secret rules, Microsoft imposed strict limits on how long Bing chat sessions could run.