YouTube Takes a Stand: No Child Sexual Abuse Material Found on Platform
NEW DELHI: YouTube, the popular video-sharing platform, said that its exhaustive investigations into allegations of child sexual abuse material (CSAM) on its service found no evidence of such content. The statement comes after the Ministry of Electronics and Information Technology (MeitY) directed several social media platforms, including YouTube, to proactively remove CSAM or risk losing their ‘safe harbour’ immunity. The ministry, however, did not specify why these particular platforms were targeted.
YouTube’s Assertion of Innocence
YouTube’s declaration of innocence was accompanied by a firm stance against CSAM. The company’s spokesperson stated, “Based on our investigations, we found no evidence of CSAM on YouTube. We are committed to preventing the spread of such content and will continue to invest heavily in the technologies and teams that detect and remove it.”
The company, which has approximately 467 million active users in India, its largest user base globally, has submitted a formal response to the government on the matter. YouTube’s assertion addresses the proactive measures recommended by MeitY.
Aggressive Actions Against CSAM
Earlier this month, Rajeev Chandrasekhar, the Union Minister of State for Electronics and Information Technology, issued notices to several prominent platforms, urging them to rid their services of explicit content that could leave children vulnerable to exploitation. The notices recommended proactive measures, including content moderation algorithms and reporting mechanisms, to prevent the future dissemination of CSAM.
YouTube’s Stringent Child Safety Policy
YouTube’s child safety policy explicitly prohibits sexually explicit content featuring minors and any content that sexually exploits them. In line with this policy, the platform has made strides in combating CSAM. According to the company’s internal data, during the second quarter of FY24, YouTube removed over 94,000 channels and more than 2.5 million videos for violating its child safety policy. These actions underscore the company’s dedication to safeguarding its users, particularly the younger audience.
Technology to Combat CSAM
Google, YouTube’s parent company, has also taken steps to combat CSAM. In 2018, it introduced the Content Safety API, which uses artificial-intelligence classifiers to help organizations identify and prioritize likely CSAM for human review. This proactive approach underscores the commitment to keeping platforms free of harmful and explicit content.
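For readers curious about what such classifier-based prioritization looks like in practice, the sketch below is a minimal, hypothetical illustration in Python. It does not use Google’s actual Content Safety API, whose endpoints and response formats are not described in this article; the `classify_risk` function and the data fields are assumptions standing in for whatever model a platform might run. The point is simply how a risk score can order flagged uploads so that human reviewers see the highest-priority items first.

```python
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class ReviewItem:
    # Negated score as the sort key so the highest-risk item is popped first
    # from Python's min-heap.
    sort_key: float
    upload_id: str = field(compare=False)
    risk_score: float = field(compare=False)


def classify_risk(upload_metadata: dict) -> float:
    """Hypothetical stand-in for an AI classifier.

    A real system would call a trained model (or an external review service)
    and return a probability that the upload violates child-safety policy.
    Here we simply read a pre-computed score for illustration.
    """
    return upload_metadata.get("model_score", 0.0)


def build_review_queue(uploads: list[dict]) -> list[ReviewItem]:
    """Score each flagged upload and queue it for human review, highest risk first."""
    heap: list[ReviewItem] = []
    for upload in uploads:
        score = classify_risk(upload)
        heapq.heappush(
            heap,
            ReviewItem(sort_key=-score, upload_id=upload["id"], risk_score=score),
        )
    return heap


if __name__ == "__main__":
    # Illustrative, made-up data: uploads already flagged by automated systems.
    flagged = [
        {"id": "vid-001", "model_score": 0.12},
        {"id": "vid-002", "model_score": 0.97},
        {"id": "vid-003", "model_score": 0.64},
    ]
    queue = build_review_queue(flagged)
    while queue:
        item = heapq.heappop(queue)
        print(f"Review {item.upload_id} (risk score {item.risk_score:.2f})")
```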
While YouTube maintains that no CSAM was found on its platform, ongoing efforts to combat such content remain crucial. The platform, along with other social media giants, continues to work towards a safe and secure online environment for all users, particularly the most vulnerable.
The Ministry of Electronics and Information Technology, in its role as a regulator, may continue to monitor and regulate these platforms to ensure the protection of India’s internet users, particularly its younger population, from harmful content and abuse.