FBI warns of increasing use of AI-generated deepfakes in sextortion schemes

by Dan Goodin, Ars Technica


The FBI on Monday warned of the increasing use of artificial intelligence to generate phony videos for sextortion schemes that attempt to harass minors and non-consenting adults or coerce them into paying ransoms or complying with other demands.

The scourge of sextortion has existed for decades. It involves an online acquaintance or stranger coercing a person into handing over a payment, an explicit or sexually themed photo, or some other concession by threatening to share compromising images already in the scammer's possession. In some cases, the images are real and were obtained from someone the victim knows or from an account that was breached. Other times, the scammers only claim to have explicit material without providing any proof.

After convincing victims their explicit or compromising pictures are in the scammers' possession, the scammers demand some form of payment in return for not sending the content to family members, friends, or employers. In the event victims send sexually explicit images as payment, scammers often use the new content to keep the scam going for as long as possible.

