Scammers cloned billionaire Sunil Mittal's voice with AI to con his executive: Report

Scammers tried to clone telecom czar Sunil Bharti Mittal's voice to trick one of his executives in Dubai into transferring money.


Scammers often use AI voice clones to con ordinary people into transferring money or disclosing financial information, but it is rare for them to impersonate a billionaire. That is exactly what scammers attempted when they cloned telecom czar Sunil Bharti Mittal's voice to trick his executive in Dubai into transferring money. Fortunately, the executive realised Mittal would not ask for such a large transfer, and the scam was stopped in its tracks.

Speaking at the NDTV World Summit on Monday, Mittal cited the incident to caution people about the risks posed by the misuse of emerging technologies such as AI. He recounted how the executive stationed in Dubai received a fraudulent call that mimicked his voice and tone and directed a large fund transfer. The vigilant and "sensible" official immediately realised it was a scam, Mittal said, admitting that when he heard the voice recording himself he was "stunned", as "it was perfectly articulated just as I would speak".



"And anyone who would not have been vigilant may have done something about it," Mittal said and warned that in future misuse of technology would enable fraudsters to go a step ahead and misuse digital signatures, even replicate faces on zoom calls to perpetrate such acts. "We'll have to protect our societies from the evils of AI, and yet we have to use the goodness of AI, because those companies, and nations that will not adopt AI will be left behind. So this is a conundrum for every time you get a new technology into place, there are pluses and minuses.

I remain very optimistic about the benefit of AI that the human race will achieve and be able to do jobs which are otherwise very difficult to perform," Mittal said. There are multiple instances of fraudsters nudging victims to click on malicious links, or using AI deepfakes and voice cloning for scams. The sophisticated web of AI-powered deceit makes it harder to spot online scams, and fraudsters have been known to clone and mimic a person’s voice from even short audio clips scraped from video a person may have uploaded online.

Scammers then leverage the AI-cloned voice to pose as the person and demand money from their friends and family. At the same time, scammers are also using the 'digital arrest' modus operandi, in which they place audio or video calls, falsely pose as law enforcement officers, and use online intimidation to confine victims to their homes for extortion. These elaborate and sophisticated scams involve cybercriminals using fake documents and replicating virtual courtrooms or police stations as a backdrop to place victims under 'digital arrest'.

Recently, SP Oswal, chairman and managing director of Vardhman Group, was defrauded of ₹7 crore by a gang that posed as officials from various government agencies.

Published - October 22, 2024 09:16 am IST