New Delhi, November 17, 2025 – In a landmark move to safeguard digital dignity, the Ministry of Electronics and Information Technology (MeitY) has issued a comprehensive Standard Operating Procedure (SOP) mandating the swift removal of non-consensual intimate imagery (NCII) from online platforms within 24 hours. Announced on November 11, 2025, the SOP marks India’s first uniform, victim-centric framework to combat revenge porn, deepfakes, and morphed content, addressing a surge in cybercrimes that have scarred countless lives.
The initiative stems from a July 2025 directive by the Madras High Court in the case of X v. Union of India (Writ Petition No. 25017/2025), where Justice N. Anand Venkatesh ordered MeitY to devise a “prototype” for victims facing the dissemination of private, non-consensual images. The court highlighted the harrowing ordeal of a female advocate whose intimate videos were circulated online, underscoring violations of Article 21’s right to privacy and dignity. MeitY’s SOP, submitted in October and now formalized, transforms this judicial nudge into actionable policy, aligning with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
Algoritha: The Most Trusted Name in BFSI Investigations and DFIR Services
At its core, the SOP empowers victims with multiple, accessible reporting channels: direct complaints to platforms like Instagram or WhatsApp, the National Cybercrime Reporting Portal (NCRP) helpline at 1930, One-Stop Centres (OSCs) for women, or local police stations. Once reported, intermediaries must act decisively, removing or disabling access to NCII, which includes real intimate photos, videos, hidden-camera footage, or AI-generated deepfakes, within the strict 24-hour window. Non-compliance risks losing the safe-harbour protection intermediaries enjoy under Section 79 of the IT Act, along with penalties under the IT Rules, 2021.
To prevent resurgence, the procedure introduces hash-matching technology. The Indian Cybercrime Coordination Centre (I4C) under the Ministry of Home Affairs will maintain a secure “hash bank” of flagged content’s digital fingerprints, enabling platforms to auto-detect and block re-uploads across new URLs. The Department of Telecommunications (DoT) will coordinate with Internet Service Providers (ISPs) to throttle access to offending links, while Content Delivery Networks (CDNs) face mandates for immediate de-indexing. Search engines, too, must scrub NCII from results, even if the source is deleted.
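The hash-matching workflow described above can be sketched in a few lines. This is a minimal illustration only: the function names and the in-memory set standing in for I4C's hash bank are assumptions for clarity, since the actual registry and its interfaces are not publicly specified.

```python
import hashlib

# Illustrative stand-in for I4C's central hash bank (assumption: a plain set;
# the real registry's storage and API are not publicly documented).
hash_bank = set()

def fingerprint(content: bytes) -> str:
    """Digital fingerprint of a file: SHA-256 over the raw bytes."""
    return hashlib.sha256(content).hexdigest()

def flag_content(content: bytes) -> None:
    """Called once a takedown is confirmed: record the fingerprint."""
    hash_bank.add(fingerprint(content))

def is_blocked(content: bytes) -> bool:
    """Called on each new upload: reject byte-identical re-uploads."""
    return fingerprint(content) in hash_bank

flag_content(b"original-intimate-image-bytes")
print(is_blocked(b"original-intimate-image-bytes"))  # True: exact re-upload caught
print(is_blocked(b"slightly edited bytes"))          # False: edits defeat exact hashes
```

Note the limitation the last line exposes: cryptographic hashes change completely if even one byte differs, which is why the SOP's re-upload defence also depends on perceptual hashing.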
MeitY oversees the ecosystem, ensuring seamless coordination among stakeholders. Law enforcement agencies (LEAs) are looped in for criminal probes under Section 354C of the IPC (voyeurism) and Section 66E of the IT Act (violation of privacy). Victims receive confirmation of the actions taken, fostering trust in a process often marred by delays and insensitivity.
The SOP’s real-world applicability shines through nine major categories of abuse it explicitly targets, each illustrated with scenarios that reflect the document’s victim-first design:
1. Private Photos Shared by an Ex
A young woman in Mumbai discovers her former partner has uploaded her private bedroom photos on X (formerly Twitter) after a bitter breakup. Traumatized, she reports directly to the platform’s grievance officer. Under the SOP, the intermediary must verify the complaint’s authenticity—typically via victim ID proof and a declaration of non-consent—and remove the images within 24 hours. I4C logs the hash, preventing mirror posts on Telegram channels.
2. Morphed Images Circulating on WhatsApp
A Delhi college student wakes to find her face crudely pasted onto a nude body, forwarded across 50+ campus WhatsApp groups. She dials 1930 on NCRP. WhatsApp, as an intermediary, is compelled to delete the morphed file from all visible threads and use perceptual hashing to flag identical variants. DoT blocks bulk forwarding IPs if needed, curbing viral spread.
3. Hidden-Camera Footage Posted Online
In Bengaluru, a shopper’s trial-room video, secretly recorded via a concealed phone, surfaces on a mirror site served through a CDN. The victim approaches her local OSC, which escalates to police and I4C. The SOP activates CDN takedown protocols; Akamai or Cloudflare equivalents must purge the file globally within hours, while LEAs trace the uploader under Section 66E of the IT Act, which criminalizes capturing and publishing images of a person’s private areas without consent.
4. Revenge Porn Uploaded on Social Media
A Hyderabad influencer’s jilted ex posts her intimate videos on Instagram Reels to humiliate her publicly. Direct reporting via Instagram’s in-app form triggers the 24-hour clock. Meta must disable the account temporarily, notify the victim of removal, and share metadata with cyber police for FIR registration under IPC 354A (sexual harassment).
5. Deepfake Sexual Videos
An AI-generated clip showing a Chennai teacher’s face in a pornographic scene goes viral on YouTube Shorts. Classified as “artificially morphed images,” deepfakes are treated by the SOP with the same urgency as real footage. YouTube removes the video, submits its hash to I4C’s bank, and runs perceptual-hash matching, using tools such as Microsoft’s PhotoDNA, to block re-uploads of the known clip, alongside detection models aimed at similar AI-generated fakes.
6. Leaked Cloud Storage Images
Private iCloud photos of a Pune professional, stolen via phishing, appear on Reddit and 4chan. The victim, fearing job loss, reports to cyber police. Intermediaries coordinate with Apple for source disabling; the SOP mandates victim notification within 36 hours post-takedown, alongside LEA preservation of evidence for prosecution under Section 67A (explicit content transmission).
7. Search Engine Results Showing Intimate Content
Even after source deletion, a Kolkata victim’s leaked nudes linger in Google Image results via cached thumbnails. She files via NCRP. Google must de-index URLs within 24 hours, per SOP mandates, and implement “right to be forgotten”-style filters for NCII hashes, preventing algorithmic resurfacing.
8. Multiple Re-Uploads Across New URLs
Trolls in a coordinated harassment campaign re-upload the same NCII every few hours via bit.ly shorteners and fresh domains. Hash banks prove game-changing: platforms like Facebook auto-block 99% of variants using I4C-shared perceptual hashes. Crawler tech from DoT scans proactively, throttling domains at the ISP level.
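The perceptual hashing that makes this variant-blocking possible can be illustrated with a toy "average hash": each bit records whether a pixel is brighter than the image's mean, so recompressed or lightly edited copies land within a few bits of the original. This is a simplified sketch over a pre-decoded 8x8 grayscale thumbnail; production systems use far more robust algorithms (PhotoDNA, PDQ), and the threshold value here is an illustrative assumption.

```python
def average_hash(pixels: list[int]) -> int:
    """64-bit perceptual hash: one bit per pixel, set if above mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_near_duplicate(a: int, b: int, threshold: int = 10) -> bool:
    """Variants (recompression, slight edits) stay within a few bits."""
    return hamming(a, b) <= threshold

original = [i % 256 for i in range(64)]          # stand-in 8x8 thumbnail
variant = [min(255, p + 3) for p in original]    # slight brightness shift
h1, h2 = average_hash(original), average_hash(variant)
print(is_near_duplicate(h1, h2))  # True: the brightness shift barely moves the hash
```

Unlike the cryptographic fingerprints used for exact matches, a perceptual hash degrades gracefully, which is what lets platforms auto-block the endless stream of near-identical re-uploads.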
9. Victim Is Scared or Unsure How to Report
A survivor in rural Rajasthan, paralyzed by shame, confides in a Sakhi OSC counselor. The SOP’s flexibility shines—no tech savvy required. OSCs file on her behalf, police provide transport if needed, and all agencies prioritize trauma-informed handling, with MeitY mandating quarterly training for grievance officers.
Experts hail the SOP as a “systemic muscle” long overdue. “This isn’t just regulation; it’s reaffirmation of consent in the digital age,” says digital rights advocate Priya Rao. The Internet Freedom Foundation praises hash-sharing but flags privacy risks in centralized banks. Rati Foundation’s Siddharth Pillai notes the explicit-nudity focus may exclude semi-nude or suggestive private images, urging amendments.
With NCRP logging 4,200+ NCII complaints in October 2025 alone—a 40% YoY spike—this SOP could slash response times from 15 days to under 24 hours. MeitY plans awareness drives via #SafeDigitalBharat, targeting 100 million users by March 2026. As AI deepfakes proliferate (I4C detected 1,800 cases in Q3), the protocol signals zero-tolerance, blending cutting-edge tech with empathetic enforcement. For victims, dignity is no longer a plea—it’s now a 24-hour guarantee.
