Georgia mother received fake ransom calls where scammers used AI to impersonate her 22-year-old daughter’s voice

A Georgia mother has become the latest to face a shocking AI phone scam using her 22-year-old daughter’s voice to claim she was kidnapped and demand a $50,000 ransom for her safe return.

So-called impostor scams, where a fraudster impersonates someone to steal money, are the most common scams in the United States, causing Americans to lose $2.6 billion in 2022 alone, the Federal Trade Commission reported.

Debbie Shelton Moore received a six-minute phone call from what she thought was her daughter Lauren, 22, who lives apart from her.

‘It sounded just like her. It was 100 percent believable,’ Shelton Moore said. ‘Enough to almost give me a heart attack from sheer panic.’

The scammers demanded money for her return – but Lauren was safe the whole time and had never been kidnapped.

Newstimesuk.com previously reported that fraudsters can mimic a victim’s voice using just three seconds of audio, often taken from social media profiles.

The cloned voice is then used to call a friend or family member, claiming to be in trouble and in urgent need of money.

Shelton Moore initially thought Lauren had been in a car accident and was calling for help, until she heard three male voices.

‘The man said, ‘Your daughter has been kidnapped and we want $50,000.’ Then they had her crying, like, ‘Mommy, Mommy’ in the background. It was her voice and that’s why I was totally freaking out,’ she told 11 Alive.

Shelton Moore grew even more alarmed when she checked Lauren’s phone location and saw it was stopped on a highway.

‘I guessed she was in the back because he said, ‘We’re in the back of the truck.’

Fortunately, her husband – who works in cyber security – overheard the call and sensed something was off. He FaceTimed Lauren, who confirmed she was in no danger, revealing that her mother was being scammed.

‘It was all kind of a blur because I was just thinking, ‘How am I going to get my daughter? How are we supposed to get them the money?” she added.

They then called the county sheriff’s office, which confirmed Lauren was safe.

‘My heart was pounding and I was shaking,’ she recalled of the moment she got the call. ‘I shudder to think about it right now.’

The scam has hit a surprising number of Americans: in an April McAfee survey, one in four people said they had some experience with AI voice scams, and one in ten said they had been personally targeted.

‘I’m very well aware of scammers and scams and IRS scams and fake jury duty,’ Shelton Moore said. ‘But of course, when you hear their voice, you’re not going to think clearly and you panic.’

Police recommend agreeing on a ‘safe phrase’ that you and your family can use to prove a caller really is who they claim to be.

Steve Grobman, McAfee’s chief technology officer, says the rise of accessible and sophisticated AI makes scams faster and easier to execute.

‘One of the most important things to recognize with the advances in AI this year is that it is bringing these technologies within the reach of many more people, including truly enabling scale within the cyberactor community,’ Grobman warned.

‘Cybercriminals are able to use generative AI for fake voices and deepfakes in ways that used to require much more sophistication.’

Vice President Kamala Harris told CEOs of leading technology companies in May that they have an increased moral responsibility to limit the harm to society from their AI products.

Vonnie Gamot, head of EMEA at McAfee, said: ‘Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and trick a close contact into sending money.

‘Artificial intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands,’ Gamot added.

‘What we see today is the access and availability of AI tools that help cybercriminals scale their efforts in increasingly credible ways.’

How to avoid falling for a scam: advice from experts

McAfee has shared a series of tips to help people avoid falling prey to AI fraudsters. They are:

1. Set a ‘codeword’ with children, family members or trusted close friends that only they know. Make a plan to always ask for it if they call, text or email for help, especially if they are older or more vulnerable.

2. Always question the source. Whether it’s a call, text or email from an unknown sender, or even from a number you recognize, stop, pause and think. Asking pointed questions can throw off a scammer.

For example, ‘Can you confirm my son’s name?’ or, ‘When is your father’s birthday?’ Not only can this catch the scammer by surprise, but they may have to come up with a new response, which can add awkward pauses to the conversation and raise suspicion.

3. Don’t let your emotions take over. Cybercriminals are counting on your emotional connection with the person they’re impersonating to spur you into action.

Take a step back before responding. Does that really sound like them? Is it something they would ask you? Hang up and call the person directly, or try to verify the information before responding.

4. Think twice before answering unexpected calls from unknown phone numbers. It is generally good advice not to answer calls from strangers; if the caller leaves a voicemail, it gives you time to reflect and to contact loved ones directly to confirm they are safe.
