Scammers use AI to clone woman's voice, terrify family with fake ransom call: 'Worst day of my life'

An Arizona family claims scammers used AI to clone their daughter's voice in a fake ransom call, recalling the horrifying incident on 'Fox & Friends First'

An Arizona family is speaking out to raise awareness of the dangers of artificial intelligence after they were allegedly targeted by a fake ransom call in which scammers used an AI clone of their daughter's voice as bait.

Viral TikToker Payton Bock and her mother DeLynne recalled the harrowing incident during "Fox & Friends First," detailing how the "life-changing" scam impacted their family. 

"It was super scary," DeLynne told Todd Piro Tuesday. "My husband actually took the phone call, and I was outside. He came out with this man on speakerphone using all kinds of foul language, screaming and yelling, saying that my daughter had hit him in a vehicle accident situation."

GOOGLE CEO ADMITS HE, EXPERTS ‘DON’T FULLY UNDERSTAND' HOW AI WORKS

"It was just awful. It was awful and very believable," she continued. "I felt as though I could see a scam, [a] normal scam like on a phone, don't click this email, don't open that, but this was a whole different level."

Bock revealed details of the incident in a TikTok last week, claiming the scammer cloned her voice to plead with her parents to save her life after she was supposedly involved in a car accident with another individual. 

She said that the scammer made "her" appear to be crying and saying she didn't want to die, all while the perpetrator was cursing, saying "vulgar things," and had her "tied up in the back of his truck."

Despite the severity of the situation, Payton explained in the interview that her mother, a flight attendant of 35 years, is very "level-headed" and knows what to do in crisis situations. 

But even her mother was convinced at the time that the call was real: she said she knew, absolutely and without a doubt, that the voice was her daughter's. 

ELON MUSK TO DEVELOP ‘TRUTHGPT’ AS HE WARNS ABOUT 'CIVILIZATION DESTRUCTION' FROM AI

"I called the police, and they're saying this is possibly a scam situation. I said there is no way this is a scam. This is my daughter's voice," DeLynne said. "This wasn't just some person pretending… As a mother, you know your daughter's voice, and this was my daughter."

DeLynne tried calling Payton several times during the incident, but Payton was at work, speaking to a client on the phone. 

It wasn't until the Phoenix Police Department called Payton that she answered the phone. 

"I called my mom, and I was like, 'Mom, I'm okay. I'm at work,' and then she was relieved, and I haven't heard her sound like that ever in my life," she said. 

ALTERNATIVE INVENTOR? BIDEN ADMIN OPENS DOOR TO NON-HUMAN, AI PATENT HOLDERS

But she said she still doesn't know who the scammer could be, as critical questions remain unanswered. 

She said the scam happened before she had established her presence online. 

"I honestly have no idea because I never posted my life online like that before," Payton responded when asked about the identity of the perpetrator. 

After posting the viral clip on TikTok last week, Payton said it became clear her family was not the only victim of this potentially evolving threat driven by AI. 

"After I posted it on TikTok, like my mom said, I thought that we were like the only ones who kind of experienced something like this," Payton said. "If you go to my video, like in the comments, it's happened to so many people. So it has to be this upcoming thing with AI, specifically, which I'm just not a fan of at all."

DeLynne noted that the frightening scam is a growing trend, which is why the family wanted to speak out: to warn other families against falling victim to AI scammers. 

"I cried for two days. Like it was just the most real scary thing I'd ever experienced. Worst day of my life," DeLynne said. "It truly was life changing. It was that scary for me."
