The Dark Side Of AI: Scammers Use Voice Cloning To Trick Indians Out Of Thousands, 47% Of Phone Users Fall Prey
NEW DELHI: 47% of Indian phone users have experienced AI voice scams, making India the country with the highest reported incidence of such scams globally, according to a recent McAfee report.
The report highlights the extent to which scammers are using AI, especially in voice scams, and how easy voice cloning has become thanks to audio-based AI models that need only a short voice prompt. McAfee researchers found more than a dozen publicly available AI voice-cloning tools on the internet, both free and paid, that require only a basic level of skill to use.
AI voice cloning has become realistic enough to sharply increase a cybercriminal's chances of duping someone into handing over money. These scams work by exploiting the emotional bonds of close relationships, and a scammer can make off with thousands of dollars in a matter of hours. The research team's overarching conclusion was that artificial intelligence has already changed the game for cybercriminals: the barrier to entry has never been lower, making it easier than ever to commit cybercrime.
One of the main reasons Indians are targeted disproportionately is that about 86% of Indian adults share their voice data online at least once a week, via social media, voice notes, and other means. Furthermore, the study found that 69% of Indian adults were unsure whether they could tell the difference between an AI-cloned voice and a real person.
66% of Indian respondents said they would respond to a phone call or voice message claiming to be from a friend or loved one in need of money. The cost of falling for an AI voice scam can be significant: 48% of Indians who lost money reported losing more than INR 50,000.
McAfee CTO Steve Grobman advises individuals to stay vigilant and take proactive steps to keep themselves and their loved ones safe. One such step is to agree on a verbal 'codeword' with children, family members, or trusted close acquaintances that only they know, and to ask for it whenever a call, text, or email requests help, particularly where elderly or vulnerable relatives are involved.
The report also highlights the growth of deepfakes and disinformation, which has made people more sceptical of what they see online: 27% of Indian adults say they are now less trusting of social media than ever before, and 43% are concerned about the rise of misinformation or disinformation.
AI has lowered the bar for cybercriminals to carry out voice scams, and Indians are being disproportionately targeted because so many routinely share their voice data online or in recorded notes. Staying vigilant and taking proactive measures remains the best way for individuals to protect themselves and their loved ones from AI-based voice scammers.