Artificial Fraudulence: Lucknow man duped of Rs 45K by AI-cloned voice, second such instance in 2 weeks
More and more people in India are being duped by scammers who use AI to generate a cloned voice of a victim's relative, then ask them to click on dubious links or to transfer money
FP Staff December 19, 2023 14:46:51 IST
In a troubling incident, a resident of Lucknow fell victim to a cyber thug who used artificial intelligence to impersonate the voice of the victim's relative.
The fraudster manipulated the victim, Kartikeya, into transferring Rs 45,000 under the guise of a financial transaction, and also duped him into sharing his bank account credentials.
Lucknow police are investigating the case.
Kartikeya, a resident of Vineet Khand, which falls under the Gomtinagar police station's jurisdiction, received a call from an unknown number from a caller who claimed to be his maternal uncle.
The impersonator explained that he was in the process of transferring Rs 90,000 to someone known to him but was facing issues with the transaction through his UPI.
The fraudster then instructed Kartikeya to send Rs 45,000 to his account, framing it as a workaround for the apparent UPI problem. Trusting the caller, Kartikeya complied and transferred the specified amount.
Soon after, Kartikeya received several SMSes stating that two sums of Rs 10,000, one of Rs 30,000 and one of Rs 40,000 had been credited to his account. Believing he had been paid back, Kartikeya only later discovered that the funds were not actually in his account.
Fortunately, several of the fraudulent transactions failed, limiting the potential loss from his account to Rs 44,500. Realising the fraud, Kartikeya promptly reported the incident to the police. An FIR has been filed, and an investigation is underway to apprehend the perpetrator.
Second such case in two weeks
This incident in Lucknow follows a similar case in Delhi, where cybercriminals used AI-based voice cloning to extort money from an elderly man, Lakshmi Chand Chawla of Yamuna Vihar.
In this instance, the scammers impersonated the voice of the victim's cousin's son, claimed the boy had been kidnapped, and coerced the victim into transferring Rs 50,000 via Paytm.
The criminals created a realistic voice recording of the child using AI voice-cloning technology to deceive the victim. When Chawla contacted the family to ask about the kidnapping, he was told that there had been no such incident. It was at this point that he realised what had happened.
AI voice clones on the rise
While this is a fairly new phenomenon in India, scammers have been using AI-cloned voices to dupe people for over a year now. One of the first cases involved a woman in Arizona, US, who received a message claiming that her daughter had been kidnapped. However, before any money changed hands, the woman discovered that it was a scam.
In light of this new scam, here are a few things to keep in mind to protect yourself.
Check the phone number before answering a call: Most scammers use numbers from Vietnam, Sudan and Malaysia to place such calls, so unless you regularly receive legitimate calls from these regions, avoid answering them. More generally, avoid picking up calls from unknown numbers if you can.
Screen your calls: There are plenty of ways to screen calls from unknown numbers. Some phones now come with AI assistants that can answer a call, screen it, and let you choose whether to take it; if you have that option, use it. Otherwise, you can text the number saying you can't take calls right now and ask them to text you instead, or better yet, message you on WhatsApp or Telegram.
Be wary of what links you click on: Be cautious with links you receive, especially from numbers not saved in your contact list. Scammers count on their victims clicking these links, which can plant malware on your phone and send out all sorts of sensitive information.