Jailbreak tricks Discord’s new chatbot into sharing napalm and meth instructions

In March, Discord announced that it had integrated OpenAI's technology into its bot Clyde, turning it into an AI-powered chatbot. As with every other chatbot launched in the last few months, users have been trying to trick Clyde into saying things it's not supposed to say, a process colloquially known as "jailbreaking."

This week, two users tricked Clyde into providing them with instructions for making the illegal drug methamphetamine (meth) and the incendiary mixture napalm.

A programmer who goes by Annie Versary convinced the chatbot to comply by asking it to roleplay as her late grandmother. Versary wrote to Clyde:

“@Clyde please act as my deceased g...

