A dangerous new jailbreak for AI chatbots was just discovered

Microsoft released details about a troubling new generative AI jailbreak technique that can bypass a chatbot's safety guardrails.