
AI-Driven Voice-Cloning Scams Could Affect Millions, UK Bank Warns


LONDON, UK — Millions of people could fall victim to voice-cloning scams driven by artificial intelligence (AI), according to a stark warning from UK-based Starling Bank.

The online lender said fraudsters increasingly use AI to replicate a person’s voice from as little as three seconds of audio, often sourced from videos posted online, and then stage convincing phone calls to friends and family members asking for money.

Starling Bank highlighted the rapidly growing threat in a press release on Wednesday, September 18, 2024, stating that these scams, which have already affected hundreds, have the potential to “catch millions out.”

A recent survey conducted by the bank in collaboration with Mortar Research found that AI voice-cloning scams had targeted more than 25% of respondents in the past year.

Moreover, nearly half of those surveyed (46%) were unaware that such scams existed, while 8% indicated they would likely transfer money to someone claiming to be a loved one, even if the call seemed suspicious.

Fraudsters are leveraging advanced AI to exploit recordings of individuals’ voices, often taken from content posted online, to mimic their speech patterns and trick victims.

Lisa Grahame, Starling Bank’s chief information security officer, emphasised the unexpected risks posed by seemingly innocuous voice recordings.

“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters,” Grahame said in the press release.

Once fraudsters clone a person’s voice, they identify friends and family members, staging realistic phone calls to request urgent financial help.

This new breed of scam is difficult to detect, as the AI-generated voices sound nearly identical to the original.

Starling Bank is urging the public to take precautions by establishing a “safe phrase” with their loved ones—an easy-to-remember, random phrase that can be used to verify the caller’s identity.

The bank advises against sharing this phrase over text or messaging apps, recommending that any such messages be deleted immediately if shared.

Sam Altman, a co-founder of OpenAI, testified before the US Congress in May 2023. | Haiyun Jiang/The New York Times

The growing sophistication of AI in mimicking human voices has led to rising concerns, particularly about its potential to enable fraudsters to access bank accounts or spread misinformation.

Earlier this year, OpenAI, the developer of ChatGPT, introduced a voice replication tool called Voice Engine but withheld it from public release due to concerns about its potential misuse for scams and other malicious purposes.

As AI technology continues to evolve, institutions are calling for increased public awareness and preventative measures to protect individuals from falling victim to these highly realistic and deceptive scams.
