Blackmailers are using deepfaked nudes to bully and extort victims, warns FBI

The FBI has issued an advisory warning of an “uptick” in extortion schemes involving fake nudes created with the help of AI editing tools.

The agency says that as of April this year, it has received an increasing number of reports of such “sextortion” schemes. Malicious actors find benign images of a victim on social media, then edit them using AI to create realistic, sexually explicit content.

“The photos are then sent directly to the victims by malicious actors for sextortion or harassment,” writes the agency. “Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or removal from the internet.”

© 2024 Thiratti. All rights reserved.