AI voice manipulation new trend for money transfers

Scamsters impersonating voices of friends, kin to lure victims


In a disturbing new trend in cybercrime, scammers are using artificial intelligence (AI)-driven voice manipulation to deceive victims. In the latest case, a victim attempting to send money to a friend living in the US was duped into transferring ₹1.8 lakh by a scammer using an AI-generated voice.

According to a police source, these cases usually fall under the category of impersonation since the accused pretends to be someone else. However, the use of AI to talk to victims by posing as their friend or relative adds a new dimension. In this case, the victim received a call purportedly from his friend's WhatsApp number, requesting urgent financial assistance and asking him to transfer the amount to an acquaintance.

Behind the scenes, the voice was manipulated using AI technology, making it sound identical to the friend's voice. This form of impersonation is gaining traction among cybercriminals, who trick people into believing they are speaking to someone they know personally, police said. Despite being wary of cybercrime, the victim believed he was helping his friend.

The fraudsters exploited the trust between the victim and the person being impersonated. In a further twist, the victim was advised to seek assistance from bank employees, and he contacted what he believed was a bank's customer care unit for help with the transaction.

However, this number too turned out to be fake, and the so-called 'customer service representative' scammed him as well, police said.