AI voice scams on the rise: ‘Scammer’s voice was exactly like my nephew’s’

18 January, 2024 06:40 AM IST | Mumbai | Shirish Vaktania

AI-generated voice simulates relative’s tone, putting pressure on victim to transfer money immediately; here are some precautions you can take

Scamsters have started using Artificial Intelligence (AI) technology to dupe unsuspecting victims, as seen in a recent incident in which a 70-year-old man from Andheri was cheated out of Rs 3.7 lakh. The scamster used AI to clone the voice of the man's nephew, who resides in the United States, and called him pretending to be the nephew. Speaking in the cloned voice, the fraudster claimed that the nephew had been kidnapped and that his passport would be destroyed unless money was paid.

The complainant in the case, Jaswant Singh Kalsi, is an electrical contractor residing at J B Nagar in Andheri East. He told the police that the scamster cloned the voice of his nephew, Jeevan Singh, to get him to transfer money to a specified account. The Andheri police registered an FIR against the unknown scamster and, during the investigation, discovered that the Rs 3.7 lakh had been transferred to an account in Delhi, from which it was moved to multiple accounts within minutes.

According to the police, the incident occurred on January 15, when Kalsi received a call from an unknown number. The caller, who sounded like his nephew Jeevan, claimed to be Jeevan and said he had sent Rs 10 lakh to Kalsi's account. He instructed Kalsi not to inform his father about the money and mentioned that he would be coming to Mumbai on January 22. A few minutes later, the scamster called again, this time pretending to be from the bank and claiming that the Rs 10 lakh had been received.

1. Jaswant Singh Kalsi receives a call from a scamster claiming to be his nephew (Jeevan) living in the US, saying he is transferring Rs 10 lakh to his bank account; 2. Kalsi receives another call from the scamster posing as Jeevan and, as instructed, transfers Rs 3.7 lakh to the specified account; he also sends a screenshot of the transaction to Jeevan; 3. Jeevan, who was in a meeting at the time of the transaction, checks his phone and contacts Kalsi, saying he didn't send him any money and hadn't asked for any cash; 4. Kalsi realises he has been duped of Rs 3.7 lakh using AI technology, after which he approaches the police to file a complaint. Illustrations/Uday Mohite

Recalling the incident, Kalsi said, "A few minutes later, I received another call from an unknown international number. The caller claimed to be my nephew Jeevan and said he had been kidnapped by a person named Jagmohan, who had also taken his passport. The scamster spoke with a voice similar to my nephew's and it sounded just like how he speaks with me daily. He informed me that Jagmohan had taken his passport and he wouldn't be able to return to Mumbai on January 22."

Still speaking in his nephew's voice, the scamster then told Kalsi to transfer Rs 3.7 lakh out of the Rs 10 lakh that had supposedly been sent to his account earlier. "I sent the money to the specified account and also sent the transfer receipt to my nephew's mobile number. Before sending the money, I tried calling my nephew, but he rejected my calls," Kalsi explained.

A few hours later, Jeevan called Kalsi and asked about the receipt Kalsi had sent him. Kalsi explained the entire situation, after which Jeevan clarified that they had not spoken at all; he had been busy in a meeting and had therefore rejected the calls. Kalsi then informed the police, who discovered that the money had been transferred to multiple accounts and withdrawn.

Ritesh Bhatia, cyber security expert, and Prashant Mali, advocate

Using AI, it is possible to clone a person's voice, make it speak in different languages and even use it in real-time conversations. Many YouTubers and social media influencers use this technology to create content featuring the voices of celebrities, politicians and others. While AI also powers useful features such as grammar checking, scamsters are now exploiting it to cheat people.

A police officer stated, "We have registered an FIR under IPC sections 419 and 420, and various sections of the IT Act against the unknown scamster. It's a new thing for us and we are seeking the assistance of the cyber police to apprehend the accused."

Deepfakes and other misuse of AI

With the use of AI, people have been creating ‘deepfakes’, wherein a person's face and voice are used to create video and audio content. Legendary cricketer Sachin Tendulkar was also a victim of a deepfake video. On X (formerly Twitter), Tendulkar said that a deepfake video of him promoting a mobile application was doing the rounds on social media.

"These videos are fake. It is disturbing to see rampant misuse of technology. Request everyone to report videos, ads and apps like these in large numbers. Social media platforms need to be alert and responsive to complaints. Swift action from their end is crucial to stopping the spread of misinformation and deepfakes," Tendulkar tweeted.

Earlier, Bollywood celebrities Rashmika Mandanna, Alia Bhatt and Priyanka Chopra had also raised their voices against deepfake videos.

ExpertSpeak

According to cyber security expert Ritesh Bhatia, scamsters use publicly accessible social media profiles to obtain personal information. "People post photos of events and tag them with messages like ‘thank you mamaji, jijaji’, etc. Scamsters often use these to target people by creating deepfake videos and audios using AI technology."

Advocate Prashant Mali, cyber lawyer, Bombay High Court, said, "Deepfake audio is a reality, and the exact same voice can be achieved using AI-based voice manipulation. If someone falls prey to such crimes, Sections 66C and 66D of the IT Act, 2000, are applicable, along with IPC Section 420. To avoid being cheated, calling back the same person or relatives/friends often helps. I feel that people should always keep in mind the possibility of a deepfake audio or video, as the technology has become an easily available tool for criminals."

‘POV’ method for precaution

Bhatia says that people should adopt the ‘POV’ method to stay safe. "In this, ‘P’ means pause: pause, wait and don't do anything immediately. ‘O’ stands for zero trust: if you receive such calls, don't trust the person and wait for more details. ‘V’ is verify: verify with relatives and other sources. We are using this simple method to spread awareness. People easily trust others online, and we are forgetting our basics. Never trust or entertain such calls."

70 yrs
Age of the man who was duped

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!
Artificial Intelligence mumbai mumbai crime branch mumbai crime news mumbai news
Related Stories