Thousands scammed by AI voices mimicking loved ones in emergencies
AI models designed to closely simulate a person's voice are making it easier for bad actors to mimic loved ones and scam vulnerable people out of thousands of dollars, The Washington Post reported.
Quickly evolving in sophistication, some AI voice-generating software requires just a few sentences of audio to convincingly produce speech that conveys the sound and emotional tone of a speaker's voice, while other options need as little as three seconds. For those targeted, often the elderly, the Post reported, it can be increasingly difficult to detect when a voice is inauthentic, even when the emergency circumstances described by scammers seem implausible.
Tech advancements seemingly make it easier to prey on people's worst fears and spook victims, who told the Post they felt "visceral horror" hearing what sounded like direct pleas from friends or family members in dire need of help. One couple sent $15,000 through a bitcoin terminal to a scammer after believing they had spoken to their son. The AI-generated voice told them that their son needed legal fees after being involved in a car accident that killed a US diplomat.