AI Voice-Cloning Fraud Strikes MP for First Time; Indore Play School Owner Loses ₹97,500

The420.in Staff

Madhya Pradesh has reported its first confirmed case of cyber fraud involving AI-based voice cloning, after a middle-aged woman running a small play school in Indore was tricked into transferring her entire savings to fraudsters who convincingly mimicked the voice of her cousin, an Uttar Pradesh Police officer.

The victim, identified as Smita Sinha (name changed), lost ₹97,500, an amount that included her personal savings, funds meant for teachers’ salaries, and instalments on a loan taken to run the school. Police said the fraudsters used advanced voice modulation technology to replicate the voice of her cousin, who works with the UP Police emergency dial service.


The Fake Emergency Call

According to the complaint, the incident took place on the night of January 6 while Smita was at home with her husband and teenage daughter. She received a call from an unknown mobile number that closely resembled her cousin’s phone number. The caller introduced himself as her cousin and spoke in a voice that was indistinguishable from his.

Smita told the police that the caller claimed one of his close friends had been admitted to a prominent private hospital in Indore and urgently required life-saving cardiac surgery. He said he needed to arrange money immediately and sought her assistance to ensure the payment reached the hospital without delay.

To add credibility to the claim, the caller told Smita that he was transferring money to her digital payment account in small amounts and requested her to forward the same to the hospital through QR codes shared during the call. He explained that he was unable to make the payment directly due to “technical issues”.

One of the QR codes sent to her phone was registered under the name “Hema”, which matched the name of an actual doctor working at the hospital mentioned by the caller. This detail further convinced the family that the request was genuine.

“As the call went on, I received message alerts indicating that money had been credited to my account. My daughter, who is more familiar with digital payments, scanned the QR codes and transferred ₹97,500 in four transactions,” Smita told the police.

However, minutes after the five-minute call ended, Smita checked her bank balance and discovered that not a single rupee had been credited. The transaction alerts, investigators later said, had been fabricated to create a false sense of confirmation.

The following day, Smita contacted her cousin in Uttar Pradesh, who denied making any such call and confirmed that the mobile number used by the fraudster did not belong to him.

“The number from which the calls, messages and QR codes were sent has remained switched off since the incident,” Smita’s husband told the police.

Investigation Underway

A complaint was lodged at Lasudia police station in Indore on January 7. An FIR has been registered against unidentified persons under relevant sections of the Bharatiya Nyaya Sanhita (BNS), 2023, and the Information Technology Act, 2000. The case has been transferred to the cybercrime wing for further investigation.

Additional Deputy Commissioner of Police (Crime), Indore, Rajesh Dandotiya, said preliminary findings strongly point to the use of AI-driven voice cloning technology.

“This appears to be the first reported case of AI voice-cloning fraud not only in Indore but possibly in the entire state. Fraudsters are now using artificial intelligence to replicate voices with alarming accuracy, making such crimes difficult to detect,” Dandotiya said.

Investigators are also probing whether an earlier call Smita received around three months ago—in which an unknown caller enquired about her play school before abruptly disconnecting—was part of reconnaissance conducted by the fraudsters.

Cybercrime officials have urged the public to exercise caution while responding to urgent financial requests received over phone calls, even if the caller sounds familiar. Police have advised citizens to verify such requests through a second channel, such as a direct call or video interaction, before transferring money.

The case highlights the growing threat posed by AI-enabled cyber frauds and the urgent need for increased public awareness as well as enhanced digital safeguards.

About the author – Ayesha Aayat is a law student and contributor covering cybercrime, online frauds, and digital safety concerns. Her writing aims to raise awareness about evolving cyber threats and legal responses.
