AI Voice Cloning Scams Could Catch Millions Out – With Over a Quarter of UK Adults Targeted in the Past Year

This post appeared first on FF News | Fintech Finance.


September 19, 2024

Voice cloning scams – where fraudsters use AI technology to replicate the voice of a friend or family member – could be set to catch millions out, according to new research released today. The data, from Starling Bank, found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year. Yet nearly half of UK adults (46%) have never even heard of such scams, let alone know how to protect themselves.

AI is giving fraudsters new ways to target people – they can now use voice cloning technology to replicate a person’s voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media. Scam artists can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently.



In the survey, nearly 1 in 10 (8%) say they would send whatever money was needed in this situation, even if they thought the call seemed strange – potentially putting millions at risk. Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam. Starling Bank has launched the Safe Phrases campaign, in support of the government’s Stop! Think Fraud campaign, encouraging the public to agree a ‘Safe Phrase’ with their close friends and family that no one else knows, allowing them to verify that they are really speaking to them.

Then if anyone is contacted by someone purporting to be a friend or family member, and they don’t know the phrase, they can immediately be alerted to the fact that it is likely a scam. With criminals utilising increasingly sophisticated methods to elicit money, financial fraud offences across England and Wales are on the rise. UK Finance found offences jumped by 46 per cent last year, and the Starling research found the average UK adult has been targeted by a fraud scam five times in the past 12 months.

Lisa Grahame, Chief Information Security Officer at Starling Bank, commented: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.

Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim. “We hope that through campaigns such as this we can arm the public with the information they need to keep themselves safe.

Simply having a Safe Phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”

When prompted as to what AI voice cloning scams entail, 79% of UK adults reported being concerned about being targeted – more so than by HMRC/High Court impersonation scams (75%), social media impersonation scams (76%), investment scams (70%) or safe account scams (73%).

Lord Sir David Hanson, Minister of State at the Home Office with Responsibility for Fraud, said: “AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.

As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime.”

To launch the campaign, Starling Bank has recruited leading actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed. Commenting on the campaign, Nesbitt said: “I think I have a pretty distinctive voice, and it’s core to my career.

So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easily it can be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary.

I’ll definitely be setting up a Safe Phrase with my own family and friends.”

To find out more about the Safe Phrases campaign, visit https://www.starlingbank.com/safe-phrases/