In one of the most audacious AI-powered scams reported in India, cybercriminals have leveraged deepfake technology to impersonate former U.S. President Donald Trump and trick hundreds of people into investing in a fictitious “Trump Hotel Rentals” scheme. The fraud, which has unfolded over several months, spans cities like Bengaluru, Tumakuru, Mangaluru, and Haveri — each reporting clusters of victims, with Haveri alone accounting for over 15 confirmed cases.
According to a detailed report, the scammers used digitally altered videos of Trump, cloned with artificial intelligence, to endorse what appeared to be a legitimate investment opportunity in Trump-branded hotels. The videos, circulated on YouTube and other social media platforms, directed viewers to a fraudulent mobile app that claimed to offer high-yield returns from property rentals and remote job options.
The visuals were persuasive. A confident Trump appeared to pitch the hotel scheme, citing economic potential and inviting viewers to invest modest amounts for “daily profits.” For many, especially in Tier-2 cities, the professional presentation and the allure of a global brand proved irresistible.
From ₹1,500 to Financial Ruin: Victim Narratives
Among the first victims to come forward was a 38-year-old advocate from Bengaluru, who shared his harrowing experience. “I saw the video in January,” he recounted. “It looked authentic. It offered daily returns and an opportunity to earn from home. I started with ₹1,500 and began receiving ₹30 daily, which made me trust the platform.” Like many others, he was gradually encouraged to invest more — eventually losing nearly ₹6 lakh.
This “return-bait” model, which starts with small payouts to build trust, mirrors the infamous pig-butchering technique used in romance and crypto scams. Victims are groomed over time, made to believe their investments are growing, and urged to inject more funds until contact is cut and access to the platform is blocked.
Karnataka police say that many victims, like the advocate, delayed reporting due to embarrassment or the belief they could recover the money. This delay allowed the fraud network to expand its reach, exploiting regional vulnerabilities and low digital literacy.
Authorities on Alert, but Prosecution Remains Elusive
Law enforcement officials have acknowledged the complexity of the scam, particularly due to its cross-platform execution. “The fraudsters used sophisticated tools — AI, social engineering, and app-based redirection,” said a senior cybercrime officer in Bengaluru. “The challenge lies in tracing the origin of these deepfake videos and dismantling the backend of the mobile app, which appears to be hosted on overseas servers.”
Investigators are now collaborating with national cyber agencies and technology companies to trace the infrastructure supporting the scam. However, with the videos repackaged into short-form clips and the app frequently changing names and hosting providers, the trail remains murky.
Authorities have issued advisories urging citizens not to invest on the strength of social media promotions and to be wary of any scheme that relies on celebrity endorsements, especially when no official verification is available. Despite these efforts, the damage has already been done. The emotional and financial toll is heavy, particularly for middle-income families who saw the scheme as a path to supplemental income in challenging economic times.
Conclusion: The New Face of Digital Fraud
This scam marks a chilling evolution in cybercrime where artificial intelligence is no longer a speculative threat but a concrete tool in the fraudster’s arsenal. As deepfake technology becomes more accessible, the potential for AI-enabled deception is growing exponentially.
The Karnataka scam is not an isolated incident but a warning shot. It underscores the need for stronger digital education, quicker investigative collaboration, and robust regulation of AI-generated content. Until then, the next scam could be just a video away.