
Scammers are increasingly relying on AI voice-cloning technology to mimic a potential victim’s friends and loved ones in an attempt to extort money. In one of the most recent examples, an Arizona mother recounted her terrifying experience to her local news affiliate.

“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” Jennifer DeStefano told a Scottsdale area CBS affiliate earlier this week. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

[Related: The FTC has its eye on AI scammers.]

According to DeStefano, she then heard a man order her “daughter” to hand over the phone, after which he demanded $1 million in exchange for her freedom. He subsequently lowered the supposed ransom to $50,000, but continued to threaten bodily harm to DeStefano’s teenager unless he received payment. DeStefano’s husband reportedly confirmed their daughter’s location and safety within five minutes of the violent scam call, but the ease with which con artists can use AI to mimic virtually anyone’s voice has left both security experts and potential victims frightened and unmoored.

As AI advances continue at breakneck speed, once expensive and time-consuming feats such as vocal imitation are now both accessible and affordable. Speaking with NPR last month, Subbarao Kambhampati, a professor of computer science at Arizona State University, explained that “before, [voice mimicking tech] required a sophisticated operation. Now small-time crooks can use it.”

[Related: Why the FTC is forming an Office of Technology.]

The story of DeStefano’s ordeal arrived less than a month after the Federal Trade Commission issued its own warning against the proliferating con artist ploy. “Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now,” the FTC said in its consumer alert, adding that all a scammer now needs is a “short audio clip” of someone’s voice to recreate their tone and inflections. Often, this source material can be easily obtained via social media content. According to Kambhampati, the clip can be as short as three seconds, and still produce convincing enough results to fool unsuspecting victims.

To guard against this rising form of harassment and extortion, the FTC advises treating such calls with skepticism. These scams often come from unfamiliar phone numbers, so it’s important to contact the person whose voice you heard immediately afterward to verify the story, either via their own real phone number or through a relative or friend. Con artists often demand payment via cryptocurrency, wire transfer, or gift cards, so be wary of any threat that includes those options as a remedy.