Horrifying Phone Call
A mom says criminals used AI to clone the voice of her 15-year-old daughter to fake a kidnapping and try to get a ransom — a shocking incident that underlines just how far the technology has come and how easily it can be abused.
It all began when the mother, Jennifer DeStefano, got a call from a mysterious number.
“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano told news station WKYT. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”
It was clearly a horrifying exchange. According to DeStefano, a man’s voice then said: “Put your head back, lie down.” The man warned her that he would “pop her so full of drugs” if she called the cops.
“I’m going to have my way with her and I’m going to drop her off in Mexico,” the man threatened.
After DeStefano balked at the demand for a $1 million ransom, the man reportedly lowered it to $50,000.
Happy Ending
The incident ended just as abruptly as it started. It didn’t take much for DeStefano to confirm her daughter was safe in her room — and extremely confused.
The tech the criminals used to clone the teen’s voice was clearly extremely convincing, she says.
“It was completely her voice,” DeStefano told WKYT. “It was her inflection. It was the way she would have cried. I never doubted for one second it was her. That’s the freaky part that really got me to my core.”
The fake kidnapping highlights a troubling trend of criminals making use of powerful AI voice-cloning tools to mimic not only the speech but even the individual mannerisms of their victims.
To protect yourself, experts have some fairly straightforward advice.
“You’ve got to keep that stuff locked down,” FBI special agent Dan Mayo told WKYT, explaining that anybody with a large public online presence could see that material used against them.
Another simple safeguard is to ask the caller about things a scammer couldn’t possibly know about the supposed victim, which could allow you to “find out real quick that it’s a scam artist,” Mayo said.
Nonetheless, according to the agent, AI scam calls like these are an almost daily occurrence — and sometimes, the criminals get away with it, too.
More on AI voices: Voice Actors Outraged That Businesses Are Using AI to Steal Their Voices